| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | stringlengths | 40 | 40 |
| text | stringlengths | 1 | 13.4M |
| id | stringlengths | 2 | 117 |
| tags | sequencelengths | 1 | 7.91k |
| created_at | stringlengths | 25 | 25 |
| metadata | stringlengths | 2 | 875k |
| last_modified | stringlengths | 25 | 25 |
| arxiv | sequencelengths | 0 | 25 |
| languages | sequencelengths | 0 | 7.91k |
| tags_str | stringlengths | 17 | 159k |
| text_str | stringlengths | 1 | 447k |
| text_lists | sequencelengths | 0 | 352 |
| processed_texts | sequencelengths | 1 | 353 |
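The per-column length ranges above can be recomputed for any split once the parent dataset is loaded. A minimal sketch, assuming the `datasets` library is installed and using a hypothetical repository id in place of the actual parent dataset:

```python
from datasets import load_dataset

# Hypothetical repo id -- substitute the real parent dataset of this dump.
ds = load_dataset("org/dataset-cards-dump", split="train")

text_lengths = [len(t) for t in ds["text"]]   # "text" is a string column
tag_counts = [len(t) for t in ds["tags"]]     # "tags" is a sequence column
print(min(text_lengths), max(text_lengths))   # compare with the string-length range above
print(min(tag_counts), max(tag_counts))       # compare with the sequence-length range above
```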
3ecef7d5c38d205b9b5b207bc5d0c83dd840cdf0
# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b-v0.2

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b-v0.2](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); see the usage sketch at the end of this card for loading it.

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-01-20T21:45:30.967248](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2/blob/main/results_2024-01-20T21-45-30.967248.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6661893043024104, "acc_stderr": 0.03160529125685929, "acc_norm": 0.6670254575133359, "acc_norm_stderr": 0.0322488639207885, "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7167756036606316, "mc2_stderr": 0.01508488302488436 }, "harness|arc:challenge|25": { "acc": 0.681740614334471, "acc_stderr": 0.013611993916971453, "acc_norm": 0.7090443686006825, "acc_norm_stderr": 0.013273077865907593 }, "harness|hellaswag|10": { "acc": 0.7092212706632145, "acc_stderr": 0.004531935391507008, "acc_norm": 0.8828918542123083, "acc_norm_stderr": 0.0032089195103099334 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595853, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595853 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361072, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361072 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6340425531914894, "acc_stderr": 0.031489558297455304, "acc_norm": 0.6340425531914894, "acc_norm_stderr": 0.031489558297455304 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6206896551724138, "acc_stderr": 0.040434618619167466, "acc_norm": 0.6206896551724138, "acc_norm_stderr": 0.040434618619167466 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47619047619047616, "acc_stderr": 0.025722097064388535, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.025722097064388535 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8161290322580645, "acc_stderr": 0.022037217340267836, "acc_norm": 0.8161290322580645, "acc_norm_stderr": 0.022037217340267836 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8737373737373737, "acc_stderr": 0.02366435940288023, "acc_norm": 0.8737373737373737, "acc_norm_stderr": 0.02366435940288023 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603347, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603347 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633506, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633506 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7100840336134454, "acc_stderr": 0.029472485833136094, "acc_norm": 0.7100840336134454, "acc_norm_stderr": 0.029472485833136094 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, 
"acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5601851851851852, "acc_stderr": 0.0338517797604481, "acc_norm": 0.5601851851851852, "acc_norm_stderr": 0.0338517797604481 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250454, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250454 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8523206751054853, "acc_stderr": 0.0230943295825957, "acc_norm": 0.8523206751054853, "acc_norm_stderr": 0.0230943295825957 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.03492606476623791, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.03492606476623791 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.0225090339370778, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.0225090339370778 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7601156069364162, "acc_stderr": 0.022989592543123567, "acc_norm": 0.7601156069364162, "acc_norm_stderr": 0.022989592543123567 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4, "acc_stderr": 0.01638463841038082, "acc_norm": 0.4, "acc_norm_stderr": 0.01638463841038082 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.02451819564187933, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.02451819564187933 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694905, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694905 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7808641975308642, "acc_stderr": 0.02301670564026219, "acc_norm": 0.7808641975308642, "acc_norm_stderr": 0.02301670564026219 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4915254237288136, "acc_stderr": 0.012768401697269054, "acc_norm": 0.4915254237288136, "acc_norm_stderr": 0.012768401697269054 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7389705882352942, "acc_stderr": 0.026679252270103128, "acc_norm": 0.7389705882352942, "acc_norm_stderr": 0.026679252270103128 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6911764705882353, "acc_stderr": 0.018690850273595298, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.018690850273595298 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598052, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.030944459778533207, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.030944459778533207 }, "harness|truthfulqa:mc|0": { "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7167756036606316, "mc2_stderr": 0.01508488302488436 }, "harness|winogrande|5": { "acc": 0.835043409629045, "acc_stderr": 0.010430917468237431 }, "harness|gsm8k|5": { "acc": 0.6489764973464746, "acc_stderr": 0.013146945941397226 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
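As referenced earlier in this card, the aggregated metrics live in the "results" configuration, whose "latest" split points at the most recent run. A short usage sketch (the exact row layout depends on the evaluation harness version):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: "results" configuration, "latest" split.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2",
    "results",
    split="latest",
)

# The split holds the aggregated scores of the run; inspect the first (typically only) row.
print(results[0])
```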
open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2
[ "region:us" ]
2024-01-20T21:47:47+00:00
{"pretty_name": "Evaluation run of macadeliccc/SOLAR-math-2x10.7b-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b-v0.2](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T21:45:30.967248](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2/blob/main/results_2024-01-20T21-45-30.967248.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6661893043024104,\n \"acc_stderr\": 0.03160529125685929,\n \"acc_norm\": 0.6670254575133359,\n \"acc_norm_stderr\": 0.0322488639207885,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7167756036606316,\n \"mc2_stderr\": 0.01508488302488436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971453,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907593\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7092212706632145,\n \"acc_stderr\": 0.004531935391507008,\n \"acc_norm\": 0.8828918542123083,\n \"acc_norm_stderr\": 0.0032089195103099334\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388535,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388535\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633506,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633506\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.0225090339370778,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.0225090339370778\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.02451819564187933,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.02451819564187933\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4915254237288136,\n \"acc_stderr\": 0.012768401697269054,\n \"acc_norm\": 0.4915254237288136,\n \"acc_norm_stderr\": 0.012768401697269054\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595298,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595298\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7167756036606316,\n \"mc2_stderr\": 0.01508488302488436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237431\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n \"acc_stderr\": 0.013146945941397226\n }\n}\n```", "repo_url": 
"https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|arc:challenge|25_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|gsm8k|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hellaswag|10_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T21-45-30.967248.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T21-45-30.967248.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T21-45-30.967248.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T21-45-30.967248.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T21-45-30.967248.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T21_45_30.967248", "path": ["**/details_harness|winogrande|5_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T21-45-30.967248.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T21_45_30.967248", "path": ["results_2024-01-20T21-45-30.967248.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T21-45-30.967248.parquet"]}]}]}
2024-01-20T21:48:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b-v0.2 Dataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b-v0.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet at the end of this card): ## Latest results These are the latest results from run 2024-01-20T21:45:30.967248 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
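A minimal sketch of the loading call referenced above, which was stripped from this plain-text copy of the card. The repository name is assumed from the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the config names listed in this record's metadata:

```python
from datasets import load_dataset

# Load the per-sample details for one task config of this evaluation run.
# Repo name assumed from the details_<org>__<model> convention; the config name
# "harness_winogrande_5" appears in the config listing for this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b-v0.2",
    "harness_winogrande_5",
    split="train",  # "train" always points at the latest results
)
print(data)
```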
[ "# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b-v0.2\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-20T21:45:30.967248(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b-v0.2\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-20T21:45:30.967248(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a341a0b1f23b7ebe16f37253b771639f0824c4b1
# Dataset Construction The `paraphrased questions` are generated by [Prompt Craft Toolkit](https://github.com/SuperBruceJia/promptcraft). # Dataset Usage ```python from datasets import load_dataset # Load dataset dataset = load_dataset("shuyuej/gsm8k_testing_promptcraft_generated") dataset = dataset["test"] print(dataset) ``` # Citation If you find our toolkit useful, please consider citing our repo and toolkit in your publications. We provide a BibTeX entry below. ```bibtex @misc{JiaPromptCraft23, author = {Jia, Shuyue}, title = {{PromptCraft}: A Prompt Perturbation Toolkit}, year = {2023}, publisher = {GitHub}, journal = {GitHub Repository}, howpublished = {\url{https://github.com/SuperBruceJia/promptcraft}}, } @misc{JiaAwesomeLLM23, author = {Jia, Shuyue}, title = {Awesome {LLM} Self-Consistency}, year = {2023}, publisher = {GitHub}, journal = {GitHub Repository}, howpublished = {\url{https://github.com/SuperBruceJia/Awesome-LLM-Self-Consistency}}, } @misc{JiaAwesomeSTS23, author = {Jia, Shuyue}, title = {Awesome Semantic Textual Similarity}, year = {2023}, publisher = {GitHub}, journal = {GitHub Repository}, howpublished = {\url{https://github.com/SuperBruceJia/Awesome-Semantic-Textual-Similarity}}, } ```
shuyuej/gsm8k_testing_promptcraft_generated
[ "license:apache-2.0", "region:us" ]
2024-01-20T22:02:54+00:00
{"license": "apache-2.0"}
2024-01-25T19:43:06+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
# Dataset Construction The 'paraphrased questions' are generated by Prompt Craft Toolkit. # Dataset Usage If you find our toolkit useful, please consider citing our repo and toolkit in your publications. We provide a BibTeX entry below.
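The usage snippet referenced by the Dataset Usage heading above was stripped from this plain-text copy; it is the same code shown in the full card earlier in this record:

```python
from datasets import load_dataset

# Load dataset
dataset = load_dataset("shuyuej/gsm8k_testing_promptcraft_generated")
dataset = dataset["test"]
print(dataset)
```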
[ "# Dataset Construction\nThe 'paraphrased questions' are generated by Prompt Craft Toolkit.", "# Dataset Usage\n\n\nIf you find our toolkit useful, please consider citing our repo and toolkit in your publications. We provide a BibTeX entry below." ]
[ "TAGS\n#license-apache-2.0 #region-us \n", "# Dataset Construction\nThe 'paraphrased questions' are generated by Prompt Craft Toolkit.", "# Dataset Usage\n\n\nIf you find our toolkit useful, please consider citing our repo and toolkit in your publications. We provide a BibTeX entry below." ]
1737e64909dd0fd296514d5293f1bbdd4b5b5fb7
# Dataset Card for single-cell multiome from bone marrow <!-- Provide a quick summary of the dataset. --> Single-cell multiomics data collected from bone marrow mononuclear cells of 12 healthy human donors. ## Dataset Details Multimodal data as a basis for benchmarking "Developing machine learning methods for biological systems is complicated by the difficulty of obtaining ground truth. Typically, machine learning tasks rely on manual annotation (as in images or natural language queries), dynamic measurements (as in longitudinal health records or weather), or multimodal measurement (as in translation or text-to-speech). However, this is more complicated in the context of single-cell biology. With single-cell data, annotation isn’t feasible. The data is noisy and not fully understood with descriptions of cell types evolving rapidly. Similarly, longitudinal measurement of all the RNA in a cell isn’t possible because the current measurement technologies involve destroying the cell. However, with multimodal single-cell data, we can now directly observe two layers of genetic information in the same cells. This provides an opportunity to use the fact these two sets of data were observed co-occurring in the same cells as ground truth. This is akin to the way that access to the same sentiment expressed in two languages provides ground truth for machine translation. However, as these technologies are relatively new, most publicly available datasets are designed for exploration, not benchmarking. To set up a competition for multimodal single-cell data integration, we set out to create a fit-for-purpose benchmarking dataset." ### Dataset Description The study design is as follows: Multiome Site 1 - Donors 1, 2, 3 Site 2 - Donors 1, 4, 5 Site 3 - Donors 3, 6, 7, 10 Site 4 - Donors 1, 8, 9 - **Curated by:** Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM - **License:** MIT ### Dataset Sources <!-- Provide the basic links for the dataset. --> - **Repository:** https://github.com/openproblems-bio - **Paper:** https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/hash/158f3069a435b314a80bdcb024f8e422-Abstract-round2.html ## Uses <!-- Address questions around how the dataset is intended to be used. --> Challenges included modality prediction, matching profiles from different modalities, and learning a joint embedding from multiple modalities. ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> The training data is accessible in an AnnData h5ad file. More information can be found on AnnData objects here. The easiest way to load these files is to use the AnnData.read_h5ad() function (see the loading sketch below). The dataset was designed with a nested batch layout such that some donor samples were measured at multiple sites with some donors measured at a single site. ## Dataset Creation Joint profiling of single-nucleus RNA and chromatin accessibility using the 10X Genomics Single Cell Multiome ATAC + Gene Expression Kit #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> To facilitate exploring the data, each dataset has been preprocessed to remove low-quality cells and doublets. The following sections detail this process for each data modality.
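A minimal loading sketch for the Dataset Structure section above, using the AnnData API it references. The file name is a hypothetical placeholder rather than an actual file in this repository:

```python
import anndata as ad

# Hypothetical file name; substitute the actual .h5ad file shipped with the dataset.
adata = ad.read_h5ad("multiome_gex_processed_training.h5ad")

print(adata)                                      # n_obs x n_vars plus obs/var/layers keys
print(adata.X.shape)                              # normalized matrix described below
print(adata.layers["counts"])                     # raw UMI counts, as documented below
print(adata.obs[["cell_type", "batch"]].head())   # per-cell metadata columns
```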
Preprocessing of gene expression (GEX) In this dataset, gene expression was measured using 3’ capture of nuclear RNA as described in the 10X Multiome Product Guide. Note, not all RNA is found in the nucleus. Comparisons of nuclear and cytosolic RNA have been previously reported (e.g. Bakken 2018; Abdelmoez 2018) as have comparisons of single-nucleus and single-cell RNA sequencing (Lake 2017). For gene expression data, cells were filtered based on mitochondrial content, UMI counts per cell, and genes detected per cell. Size factors were then calculated using scran and stored in adata.obs["size_factors"]. Counts were then normalized per cell by dividing the UMI counts by the size factors. Original counts are stored in adata.layers["counts"]. The size factor normalized counts are stored in adata.X. Finally, normalized counts are log1p transformed. These normalized counts are stored in adata.layers["log_norm"]. More information about best practices for single-cell analysis can be found here. Preprocessing of ATAC The chromatin accessibility data acquired by ATAC-seq as part of the 10X Multiome protocol was processed using Signac. Quality control, dimensionality reduction and translating peaks to gene activity scores were performed using Signac, following the authors’ instructions. After loading the peak-by-cell matrix, counts were binarized to only represent an accessible versus non-accessible state of each region. Cells were then filtered based on 5 quality control metrics comprising the total number of fragments, the enrichment of fragments detected at transcription start sites (TSS), the fraction of fragments in peak regions compared to peak-flanking regions, the fraction of peaks blacklisted by the ENCODE consortium, and the nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required to span across one nucleosome or multiples of it. Since ATAC data is sparser than gene expression data, peaks were included if they were accessible in at least 15 cells. Finally, the data was binarized by setting all values >0 to 1 and stored in adata.X. Raw UMI counts for each peak can be found in adata.layers["counts"]. Preprocessing of protein abundance (ADT) The protein data was measured using the TotalSeq™-B Human Universal Cocktail, V1.0 of 134 cell surface markers and 6 isotype controls. The isotype controls are stored in adata.obsm["isotype_controls"]. These controls do not target any human proteins and their expression should be considered background. The ADT protein measurements were run through quality control based on the total number of ADTs (ranging from 1100-1200 to 24000 across samples), the number of proteins captured in each cell (with a lower limit of 80) and the ADT count of the 6 isotype controls summed up in each cell (ranging from 1 to 100). Since the total number of captured ADTs is limited, absolute ADT counts appear to be lower if highly abundant proteins are present. To account for this effect, normalization was performed using the centered log ratio (CLR) transformation. CLR counts are stored in adata.X and the raw counts are stored in adata.layers["counts"]. #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> Metadata More information about the features is available in the .var and .obs DataFrames of each object.
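Before the metadata listings below, for concreteness, a small sketch of the two normalizations described in the preprocessing sections above: size-factor scaling plus log1p for the GEX counts, and a common per-cell formulation of the CLR transform for the ADT counts. The toy arrays are hypothetical stand-ins for adata.layers["counts"] and adata.obs["size_factors"]; this illustrates the arithmetic, not the exact pipeline code:

```python
import numpy as np

# Hypothetical toy data standing in for adata.layers["counts"] (cells x features)
# and adata.obs["size_factors"].
counts = np.array([[4.0, 0.0, 6.0],
                   [1.0, 3.0, 0.0]])
size_factors = np.array([2.0, 0.5])

# GEX: divide each cell's UMI counts by its size factor, then log1p
# (corresponding to adata.X and adata.layers["log_norm"] in the card).
norm = counts / size_factors[:, None]
log_norm = np.log1p(norm)

# ADT: centered log ratio per cell, in the common log1p-minus-mean formulation.
log_counts = np.log1p(counts)
clr = log_counts - log_counts.mean(axis=1, keepdims=True)

print(log_norm)
print(clr)
```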
Gene expression observation metadata The GEX adata objects have the following columns: .obs.index - The cell barcode for that observation with the batch label appended. .obs["n_genes_by_counts"] - The number of genes with at least 1 count in a cell. .obs["pct_counts_mt"] - Percent of UMI counts mapped to mitochondrial genes. .obs["n_counts"] - Number of UMIs detected in the cell .obs["n_genes"] - Number of genes detected in the cell .obs["size_factors"] - The estimated size factor for the cell. See OSCA Ch. 7 - Normalization .obs["phase"] - The cell cycle phase for each cell as calculated by scanpy.tl.score_genes_cell_cycle .obs["leiden_final"] - .obs["atac_ann"] - The cell type annotation of the cell from the joint ATAC data .obs["cell_type"] - The cell type annotation of the cells from the GEX data .obs["pseudotime_order_GEX"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data. .obs["batch"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1. For more info on how the QC metrics were calculated, consult scanpy.pp.calculate_qc_metrics Gene expression feature metadata The GEX adata.var DataFrames have the following columns: .var.index - Ensembl Gene Names for each gene .var["gene_ids"] - Ensembl Stable IDs used to uniquely track genes whose Gene Names may change over time. .var["feature_types"] - Denotes each feature as a gene expression feature. Should be GEX for all genes .var["genome"] - The Genome Assembly used for read mapping. .var["n_cells-[batch]"] - The number of cells in [batch] in which the gene was detected. .var["highly_variable-[batch]"] - Whether the gene was determined to be highly variable in [batch] ATAC observation metadata The ATAC adata.obs DataFrames have the following columns: .obs.index - The cell barcode for that observation with the batch label appended. .obs["nCount_peaks"] - The number of peaks detected in the cell. .obs["atac_fragments"] - Number of UMI counts in the cell (both in and not in peaks) .obs["reads_in_peaks_frac"] - Fraction of UMIs in peaks .obs["blacklist_fraction"] - Fraction of UMIs in Encode Blacklisted regions .obs["nucleosome_signal"] - The nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required to span across one nucleosome or multiples of it .obs["phase"] - The cell cycle phase for each cell as calculated by scanpy.tl.score_genes_cell_cycle .obs["leiden_final"] - .obs["rna_ann"] - The cell type annotation of the cell from the joint RNA data .obs["cell_type"] - The cell type annotation of the cells from the ATAC data .obs["pseudotime_order_ATAC"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data. .obs["batch"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1. For more info on how the QC metrics were calculated, consult the Signac documentation. ATAC feature metadata The ATAC adata.var DataFrames have the following columns: .var.index - Genomic coordinates for each ATAC peak that are directly related to the reference genome, and include the chromosome name*, start position, and end position in the following format: chr1-1234570-1234870. .var["feature_types"] - Denotes the feature type. Should be ATAC for all peaks .var["n_cells-[batch]"] - The number of cells in [batch] in which the peak was detected.
*For the curious, chromosome names like KI270726.1 represent scaffolds that are either unlocalized or unplaced (see Genome Assemblies from Ensembl). There is also information about the observations in the .obs DataFrame of each AnnData object. ## Potential biases Cell type identification and doublet removal were already performed. Donors varied by age (22 - 40), sex, and ethnicity (details in the associated datasheet). #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM ## Citation <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/hash/158f3069a435b314a80bdcb024f8e422-Abstract-round2.html
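A short sketch of subsetting cells with the .obs and .var columns documented in the metadata listings above; the file name is hypothetical, as in the earlier loading sketch:

```python
import anndata as ad

# Hypothetical file name; substitute the actual .h5ad file shipped with the dataset.
adata = ad.read_h5ad("multiome_gex_processed_training.h5ad")

# Column names taken from the observation/feature metadata documented above.
s1d1 = adata[adata.obs["batch"] == "s1d1"]                      # cells from Site 1, Donor 1
gex_features = adata[:, adata.var["feature_types"] == "GEX"]    # gene expression features

print(s1d1.shape, gex_features.shape)
```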
paupaiz/Bone_Marrow_BMMCs
[ "task_categories:feature-extraction", "license:mit", "biology", "medical", "region:us" ]
2024-01-20T22:29:03+00:00
{"license": "mit", "task_categories": ["feature-extraction"], "tags": ["biology", "medical"]}
2024-02-02T05:14:14+00:00
[]
[]
TAGS #task_categories-feature-extraction #license-mit #biology #medical #region-us
# Dataset Card for single-cell multiome from bone marrow Single-cell multiomics data collected from bone marrow mononuclear cells of 12 healthy human donors. ## Dataset Details Multimodal data as a basis for benchmarking "Developing machine learning methods for biological systems is complicated by the difficulty of obtaining ground truth. Typically, machine learning tasks rely on manual annotation (as in images or natural language queries), dynamic measurements (as in longitudinal health records or weather), or multimodal measurement (as in translation or text-to-speech). However, this is more complicated in the context of single-cell biology. With single-cell data, annotation isn’t feasible. The data is noisy and not fully understood with descriptions of cell types evolving rapidly. Similarly, longitudinal measurement of all the RNA in a cell isn’t possible because the current measurement technologies involve destroying the cell. However, with multimodal single-cell data, we can now directly observe two layers of genetic information in the same cells. This provides an opportunity to use the fact these two sets of data were observed co-occurring in the same cells as ground truth. This is akin to the way that access to the same sentiment expressed in two languages provides ground truth for machine translation. However, as these technologies are relatively new, most publicly available datasets are designed for exploration, not benchmarking. To set up a competition for multimodal single-cell data integration, we set out to create a fit-for-purpose benchmarking dataset." ### Dataset Description The study design is as follows: Multiome Site 1 - Donors 1, 2, 3 Site 2 - Donors 1, 4, 5 Site 3 - Donors 3, 6, 7, 10 Site 4 - Donors 1, 8, 9 - Curated by: Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM - License: MIT ### Dataset Sources - Repository: URL - Paper: URL ## Uses Challenges included modality prediction, matching profiles from different modalities, and learning a joint embedding from multiple modalities. ## Dataset Structure The training data is accessible in an AnnData h5ad file. More information can be found on AnnData objects here. You can load these files is to use the AnnData.read_h5ad() function. The dataset was designed with a nested batch layout such that some donor samples were measured at multiple sites with some donors measured at a single site. ## Dataset Creation Joint profiling of single-nucleus RNA and chromatin accessibility using the 10X Genomics Single Cell Multiome ATAC + Gene Expression Kit #### Data Collection and Processing To facilitate exploring the data, each dataset has been preprocessed to remove low quality cells and doublets. The following sections detail this process for each data modality. Preprocessing of gene expression (GEX) In this dataset, gene expression was measured using 3’ capture of nuclear RNA as described in the 10X Multiome Product Guide. Note, not all RNA is found in the nucleus. Comparisons of nuclear and cytosolic RNA have been previously reported (e.g. Bakken 2018; Abdelmoez 2018) as have comparisons of single-nucleus and single-cell RNA sequencing (Lake 2017). For gene expression data, cells were filtered based on mitochondrial content, UMI counts per cell, and genes detected per cell. Size factors were then calculated using scran and stored in URL["size_factors"]. Counts were then normalized per cell by divided the UMI counts by the size factors. Original counts are stored in URL["counts"]. 
The size factor normalized counts are stored in adata.X. Finally, normalized counts are log1p transformed. These normalized counts are stores in URL["log_norm"]. More information about best practices for single-cell analysis can be found here. Preprocessing of ATAC The chromatin accessibility data acquired by ATAC-seq as part of the 10X Multiome protocol was processed using Signac. Quality control, dimensionality reduction and translating peaks to gene activity scores was performed using Signac, following the authors’ instructions. After loading the peak-by-cell matrix, counts were binarized to only represent an accessible versus non-accessible state of each region. Cells were then filtered based on 5 quality control metrics comprising the total number of fragments, the enrichment of fragments detected at transcription start sites (TSS), the fraction of fragments in peak regions compared to peak-flanking regions, the fraction of peaks blacklisted by the ENCODE consortium, and the nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it. Since ATAC data is sparser than gene expression data, peaks were included if they were accessible in at least 15 cells. Finally, the data was binarized by setting all values >0 to 1 and stored in adata.X. Raw UMI counts for each peak can be found in URL["counts"]. Preprocessing of protein abundance (ADT) The protein data was measured using the TotalSeq™-B Human Universal Cocktail, V1.0 of 134 cell surface markers and 6 isotype controls. The isotype controls are stored in URL["isotype_controls"]. These controls do not target any human proteins and their expression should be considered background. The ADT protein measurements were run through quality control based on the total number of ADTs (ranging from 1100-1200 to 24000 across samples), the number of proteins captured in each cell (with a lower limit of 80) and the ADT count of the 6 isotype controls summed up in each cell (ranging from 1 to 100). Since the total number of captured ADTs is limited, absolute ADT counts appear to be lower if highly abundant proteins are present. To account for this effect, normalization was performed using the centered log ratio (CLR) transformation. CLR counts are stored in adata.X and the raw counts are stored in URL["counts"]. #### Annotation process Metadata More information about the features are available in the .var and .obs DataFrames of each object. Gene expression observation metadata The GEX adata objects have the following columns: .URL - The cell barcode for that observation with the batch label appended. .obs["n_genes_by_counts"] - The number of genes with at least 1 count in a cell. .obs["pct_counts_mt"] - Percent of UMI counts mapped to mitochondrial genes. .obs["n_counts"] - Number of UMIs detected in the cell .obs["n_genes"] - Number of genes detected in the cell .obs["size_factors"] - The estimated size factor for the cell. See OSCA Ch. 7 - Normalization .obs["phase"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle .obs["leiden_final"] - .obs["atac_ann"] - The cell type annotation of the cell from the joint ATAC data .obs["cell_type"] - The cell type annotation of the cells from the GEX data .obs["pseudotime_order_GEX"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data. .obs["batch"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1. 
For more info on how the QC metrics were calculated, consult URL.calculate_qc_metrics Gene expression feature metadata The GEX URL DataFrames have the following columns: .URL - Ensembl Gene Names for each gene .var["gene_ids"] - Ensembl Stable IDs used to uniquely track genes whose Gene Names may change over time. .var["feature_types"] - Denotes the each feature as a gene expression feature. Should be GEX for all genes .var["genome"] - The Genome Assembly used for read mapping. .var["n_cells-[batch]"] - The number of cells in [batch] in which the gene was detected. .var["highly_variable-[batch]"] - Whether the gene was determined to be highly variable in [batch] ATAC observation metadata The ATAC URL DataFrames have the following columns: .URL - The cell barcode for that observation with the batch label appended. .obs["nCount_peaks"] - The number of peaks detected in the cell. .obs["atac_fragments"] - Number of UMI counts in the cell (both in and not in peaks) .obs["reads_in_peaks_frac"] - Fraction of UMIs in peaks .obs["blacklist_fraction"] - Fraction of UMIs in Encode Blacklisted regions .obs["nucleosome_signal"] - The nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it .obs["phase"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle .obs["leiden_final"] - .obs["rna_ann"] - The cell type annotation of the cell from the joint RNA data .obs["cell_type"] - The cell type annotation of the cells from the ATAC data .obs["pseudotime_order_ATAC"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data. .obs["batch"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1. For more info on how the QC metrics were calculated, consult the Signac documentation. ATAC feature metadata The ATAC URL DataFrames have the following columns: .URL - Genomic coordinates for each ATAC peak that are directly related to the reference genome, and include the chromosome name*, start position, and end position in the following format: chr1-1234570-1234870. .var["feature_types"] - Denotes the each feature as a gene expression feature. Should be ATAC for all peaks .var["n_cells-[batch]"] - The number of cells in [batch] in which the peak was detected. *For the curious, chromosome names like KI270726.1 represent scaffold that are either unlocalized or unplaced (see Genome Assemblies from Ensembl) There is also information about the observations in the .obs DataFrame of each AnnData object. ## Potential biases Cell type identification and doublet removal were already performed. Donors varied by age (22 - 40), sex, and ethnicity (details in the associated datasheet). #### Who are the annotators? Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM URL
[ "# Dataset Card for single-cell multiome from bone marrow\n\n\n\nSingle-cell multiomics data collected from bone marrow mononuclear cells of 12 healthy human donors.", "## Dataset Details\nMultimodal data as a basis for benchmarking\n\"Developing machine learning methods for biological systems is complicated by the difficulty of obtaining ground truth. Typically, machine learning tasks rely on manual annotation (as in images or natural language queries), dynamic measurements (as in longitudinal health records or weather), or multimodal measurement (as in translation or text-to-speech). However, this is more complicated in the context of single-cell biology.\nWith single-cell data, annotation isn’t feasible. The data is noisy and not fully understood with descriptions of cell types evolving rapidly. Similarly, longitudinal measurement of all the RNA in a cell isn’t possible because the current measurement technologies involve destroying the cell. However, with multimodal single-cell data, we can now directly observe two layers of genetic information in the same cells. This provides an opportunity to use the fact these two sets of data were observed co-occurring in the same cells as ground truth. This is akin to the way that access to the same sentiment expressed in two languages provides ground truth for machine translation.\nHowever, as these technologies are relatively new, most publicly available datasets are designed for exploration, not benchmarking. To set up a competition for multimodal single-cell data integration, we set out to create a fit-for-purpose benchmarking dataset.\"", "### Dataset Description\nThe study design is as follows:\n\nMultiome\nSite 1 - Donors 1, 2, 3\nSite 2 - Donors 1, 4, 5\nSite 3 - Donors 3, 6, 7, 10\nSite 4 - Donors 1, 8, 9\n\n- Curated by: Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM\n- License: MIT", "### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: URL", "## Uses\n\n\nChallenges included modality prediction, matching profiles from different modalities, and learning a joint embedding from multiple modalities.", "## Dataset Structure\n\n\n\nThe training data is accessible in an AnnData h5ad file. More information can be found on AnnData objects here. You can load these files is to use the AnnData.read_h5ad() function.\nThe dataset was designed with a nested batch layout such that some donor samples were measured at multiple sites with some donors measured at a single site.", "## Dataset Creation\nJoint profiling of single-nucleus RNA and chromatin accessibility using the 10X Genomics Single Cell Multiome ATAC + Gene Expression Kit", "#### Data Collection and Processing\n\n\n\nTo facilitate exploring the data, each dataset has been preprocessed to remove low quality cells and doublets. The following sections detail this process for each data modality.\n\nPreprocessing of gene expression (GEX)\nIn this dataset, gene expression was measured using 3’ capture of nuclear RNA as described in the 10X Multiome Product Guide. Note, not all RNA is found in the nucleus. Comparisons of nuclear and cytosolic RNA have been previously reported (e.g. Bakken 2018; Abdelmoez 2018) as have comparisons of single-nucleus and single-cell RNA sequencing (Lake 2017).\n\nFor gene expression data, cells were filtered based on mitochondrial content, UMI counts per cell, and genes detected per cell. 
Size factors were then calculated using scran and stored in URL[\"size_factors\"].\n\nCounts were then normalized per cell by divided the UMI counts by the size factors. Original counts are stored in URL[\"counts\"]. The size factor normalized counts are stored in adata.X.\n\nFinally, normalized counts are log1p transformed. These normalized counts are stores in URL[\"log_norm\"].\n\nMore information about best practices for single-cell analysis can be found here.\n\nPreprocessing of ATAC\nThe chromatin accessibility data acquired by ATAC-seq as part of the 10X Multiome protocol was processed using Signac. Quality control, dimensionality reduction and translating peaks to gene activity scores was performed using Signac, following the authors’ instructions. After loading the peak-by-cell matrix, counts were binarized to only represent an accessible versus non-accessible state of each region. Cells were then filtered based on 5 quality control metrics comprising the total number of fragments, the enrichment of fragments detected at transcription start sites (TSS), the fraction of fragments in peak regions compared to peak-flanking regions, the fraction of peaks blacklisted by the ENCODE consortium, and the nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it.\n\nSince ATAC data is sparser than gene expression data, peaks were included if they were accessible in at least 15 cells.\n\nFinally, the data was binarized by setting all values >0 to 1 and stored in adata.X. Raw UMI counts for each peak can be found in URL[\"counts\"].\n\nPreprocessing of protein abundance (ADT)\nThe protein data was measured using the TotalSeq™-B Human Universal Cocktail, V1.0 of 134 cell surface markers and 6 isotype controls. The isotype controls are stored in URL[\"isotype_controls\"]. These controls do not target any human proteins and their expression should be considered background.\n\nThe ADT protein measurements were run through quality control based on the total number of ADTs (ranging from 1100-1200 to 24000 across samples), the number of proteins captured in each cell (with a lower limit of 80) and the ADT count of the 6 isotype controls summed up in each cell (ranging from 1 to 100).\n\nSince the total number of captured ADTs is limited, absolute ADT counts appear to be lower if highly abundant proteins are present. To account for this effect, normalization was performed using the centered log ratio (CLR) transformation. CLR counts are stored in adata.X and the raw counts are stored in URL[\"counts\"].", "#### Annotation process\n\n\nMetadata\nMore information about the features are available in the .var and .obs DataFrames of each object.\n\nGene expression observation metadata\nThe GEX adata objects have the following columns:\n\n.URL - The cell barcode for that observation with the batch label appended.\n.obs[\"n_genes_by_counts\"] - The number of genes with at least 1 count in a cell.\n.obs[\"pct_counts_mt\"] - Percent of UMI counts mapped to mitochondrial genes.\n.obs[\"n_counts\"] - Number of UMIs detected in the cell\n.obs[\"n_genes\"] - Number of genes detected in the cell\n.obs[\"size_factors\"] - The estimated size factor for the cell. See OSCA Ch. 
7 - Normalization\n.obs[\"phase\"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle\n.obs[\"leiden_final\"] -\n.obs[\"atac_ann\"] - The cell type annotation of the cell from the joint ATAC data\n.obs[\"cell_type\"] - The cell type annotation of the cells from the GEX data\n.obs[\"pseudotime_order_GEX\"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data.\n.obs[\"batch\"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1.\nFor more info on how the QC metrics were calculated, consult URL.calculate_qc_metrics\n\nGene expression feature metadata\nThe GEX URL DataFrames have the following columns:\n\n.URL - Ensembl Gene Names for each gene\n.var[\"gene_ids\"] - Ensembl Stable IDs used to uniquely track genes whose Gene Names may change over time.\n.var[\"feature_types\"] - Denotes the each feature as a gene expression feature. Should be GEX for all genes\n.var[\"genome\"] - The Genome Assembly used for read mapping.\n.var[\"n_cells-[batch]\"] - The number of cells in [batch] in which the gene was detected.\n.var[\"highly_variable-[batch]\"] - Whether the gene was determined to be highly variable in [batch]\nATAC observation metadata\nThe ATAC URL DataFrames have the following columns:\n\n.URL - The cell barcode for that observation with the batch label appended.\n.obs[\"nCount_peaks\"] - The number of peaks detected in the cell.\n.obs[\"atac_fragments\"] - Number of UMI counts in the cell (both in and not in peaks)\n.obs[\"reads_in_peaks_frac\"] - Fraction of UMIs in peaks\n.obs[\"blacklist_fraction\"] - Fraction of UMIs in Encode Blacklisted regions\n.obs[\"nucleosome_signal\"] - The nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it\n.obs[\"phase\"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle\n.obs[\"leiden_final\"] -\n.obs[\"rna_ann\"] - The cell type annotation of the cell from the joint RNA data\n.obs[\"cell_type\"] - The cell type annotation of the cells from the ATAC data\n.obs[\"pseudotime_order_ATAC\"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data.\n.obs[\"batch\"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1.\nFor more info on how the QC metrics were calculated, consult the Signac documentation.\n\nATAC feature metadata\nThe ATAC URL DataFrames have the following columns:\n\n.URL - Genomic coordinates for each ATAC peak that are directly related to the reference genome, and include the chromosome name*, start position, and end position in the following format: chr1-1234570-1234870.\n.var[\"feature_types\"] - Denotes the each feature as a gene expression feature. Should be ATAC for all peaks\n.var[\"n_cells-[batch]\"] - The number of cells in [batch] in which the peak was detected.\n*For the curious, chromosome names like KI270726.1 represent scaffold that are either unlocalized or unplaced (see Genome Assemblies from Ensembl)\n\nThere is also information about the observations in the .obs DataFrame of each AnnData object.", "## Potential biases\nCell type identification and doublet removal were already performed. Donors varied by age (22 - 40), sex, and ethnicity\n(details in the associated datasheet).", "#### Who are the annotators?\n\n\n\nBurkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM\n\nURL" ]
[ "TAGS\n#task_categories-feature-extraction #license-mit #biology #medical #region-us \n", "# Dataset Card for single-cell multiome from bone marrow\n\n\n\nSingle-cell multiomics data collected from bone marrow mononuclear cells of 12 healthy human donors.", "## Dataset Details\nMultimodal data as a basis for benchmarking\n\"Developing machine learning methods for biological systems is complicated by the difficulty of obtaining ground truth. Typically, machine learning tasks rely on manual annotation (as in images or natural language queries), dynamic measurements (as in longitudinal health records or weather), or multimodal measurement (as in translation or text-to-speech). However, this is more complicated in the context of single-cell biology.\nWith single-cell data, annotation isn’t feasible. The data is noisy and not fully understood with descriptions of cell types evolving rapidly. Similarly, longitudinal measurement of all the RNA in a cell isn’t possible because the current measurement technologies involve destroying the cell. However, with multimodal single-cell data, we can now directly observe two layers of genetic information in the same cells. This provides an opportunity to use the fact these two sets of data were observed co-occurring in the same cells as ground truth. This is akin to the way that access to the same sentiment expressed in two languages provides ground truth for machine translation.\nHowever, as these technologies are relatively new, most publicly available datasets are designed for exploration, not benchmarking. To set up a competition for multimodal single-cell data integration, we set out to create a fit-for-purpose benchmarking dataset.\"", "### Dataset Description\nThe study design is as follows:\n\nMultiome\nSite 1 - Donors 1, 2, 3\nSite 2 - Donors 1, 4, 5\nSite 3 - Donors 3, 6, 7, 10\nSite 4 - Donors 1, 8, 9\n\n- Curated by: Burkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM\n- License: MIT", "### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: URL", "## Uses\n\n\nChallenges included modality prediction, matching profiles from different modalities, and learning a joint embedding from multiple modalities.", "## Dataset Structure\n\n\n\nThe training data is accessible in an AnnData h5ad file. More information can be found on AnnData objects here. You can load these files is to use the AnnData.read_h5ad() function.\nThe dataset was designed with a nested batch layout such that some donor samples were measured at multiple sites with some donors measured at a single site.", "## Dataset Creation\nJoint profiling of single-nucleus RNA and chromatin accessibility using the 10X Genomics Single Cell Multiome ATAC + Gene Expression Kit", "#### Data Collection and Processing\n\n\n\nTo facilitate exploring the data, each dataset has been preprocessed to remove low quality cells and doublets. The following sections detail this process for each data modality.\n\nPreprocessing of gene expression (GEX)\nIn this dataset, gene expression was measured using 3’ capture of nuclear RNA as described in the 10X Multiome Product Guide. Note, not all RNA is found in the nucleus. Comparisons of nuclear and cytosolic RNA have been previously reported (e.g. Bakken 2018; Abdelmoez 2018) as have comparisons of single-nucleus and single-cell RNA sequencing (Lake 2017).\n\nFor gene expression data, cells were filtered based on mitochondrial content, UMI counts per cell, and genes detected per cell. 
Size factors were then calculated using scran and stored in URL[\"size_factors\"].\n\nCounts were then normalized per cell by divided the UMI counts by the size factors. Original counts are stored in URL[\"counts\"]. The size factor normalized counts are stored in adata.X.\n\nFinally, normalized counts are log1p transformed. These normalized counts are stores in URL[\"log_norm\"].\n\nMore information about best practices for single-cell analysis can be found here.\n\nPreprocessing of ATAC\nThe chromatin accessibility data acquired by ATAC-seq as part of the 10X Multiome protocol was processed using Signac. Quality control, dimensionality reduction and translating peaks to gene activity scores was performed using Signac, following the authors’ instructions. After loading the peak-by-cell matrix, counts were binarized to only represent an accessible versus non-accessible state of each region. Cells were then filtered based on 5 quality control metrics comprising the total number of fragments, the enrichment of fragments detected at transcription start sites (TSS), the fraction of fragments in peak regions compared to peak-flanking regions, the fraction of peaks blacklisted by the ENCODE consortium, and the nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it.\n\nSince ATAC data is sparser than gene expression data, peaks were included if they were accessible in at least 15 cells.\n\nFinally, the data was binarized by setting all values >0 to 1 and stored in adata.X. Raw UMI counts for each peak can be found in URL[\"counts\"].\n\nPreprocessing of protein abundance (ADT)\nThe protein data was measured using the TotalSeq™-B Human Universal Cocktail, V1.0 of 134 cell surface markers and 6 isotype controls. The isotype controls are stored in URL[\"isotype_controls\"]. These controls do not target any human proteins and their expression should be considered background.\n\nThe ADT protein measurements were run through quality control based on the total number of ADTs (ranging from 1100-1200 to 24000 across samples), the number of proteins captured in each cell (with a lower limit of 80) and the ADT count of the 6 isotype controls summed up in each cell (ranging from 1 to 100).\n\nSince the total number of captured ADTs is limited, absolute ADT counts appear to be lower if highly abundant proteins are present. To account for this effect, normalization was performed using the centered log ratio (CLR) transformation. CLR counts are stored in adata.X and the raw counts are stored in URL[\"counts\"].", "#### Annotation process\n\n\nMetadata\nMore information about the features are available in the .var and .obs DataFrames of each object.\n\nGene expression observation metadata\nThe GEX adata objects have the following columns:\n\n.URL - The cell barcode for that observation with the batch label appended.\n.obs[\"n_genes_by_counts\"] - The number of genes with at least 1 count in a cell.\n.obs[\"pct_counts_mt\"] - Percent of UMI counts mapped to mitochondrial genes.\n.obs[\"n_counts\"] - Number of UMIs detected in the cell\n.obs[\"n_genes\"] - Number of genes detected in the cell\n.obs[\"size_factors\"] - The estimated size factor for the cell. See OSCA Ch. 
7 - Normalization\n.obs[\"phase\"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle\n.obs[\"leiden_final\"] -\n.obs[\"atac_ann\"] - The cell type annotation of the cell from the joint ATAC data\n.obs[\"cell_type\"] - The cell type annotation of the cells from the GEX data\n.obs[\"pseudotime_order_GEX\"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data.\n.obs[\"batch\"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1.\nFor more info on how the QC metrics were calculated, consult URL.calculate_qc_metrics\n\nGene expression feature metadata\nThe GEX URL DataFrames have the following columns:\n\n.URL - Ensembl Gene Names for each gene\n.var[\"gene_ids\"] - Ensembl Stable IDs used to uniquely track genes whose Gene Names may change over time.\n.var[\"feature_types\"] - Denotes the each feature as a gene expression feature. Should be GEX for all genes\n.var[\"genome\"] - The Genome Assembly used for read mapping.\n.var[\"n_cells-[batch]\"] - The number of cells in [batch] in which the gene was detected.\n.var[\"highly_variable-[batch]\"] - Whether the gene was determined to be highly variable in [batch]\nATAC observation metadata\nThe ATAC URL DataFrames have the following columns:\n\n.URL - The cell barcode for that observation with the batch label appended.\n.obs[\"nCount_peaks\"] - The number of peaks detected in the cell.\n.obs[\"atac_fragments\"] - Number of UMI counts in the cell (both in and not in peaks)\n.obs[\"reads_in_peaks_frac\"] - Fraction of UMIs in peaks\n.obs[\"blacklist_fraction\"] - Fraction of UMIs in Encode Blacklisted regions\n.obs[\"nucleosome_signal\"] - The nucleosome signal, which describes the length distribution of fragments which is expected to follow the length of DNA required span across one nucleosome or multiples of it\n.obs[\"phase\"] - The cell cycle phase for each cell as calculated by URL.score_genes_cell_cycle\n.obs[\"leiden_final\"] -\n.obs[\"rna_ann\"] - The cell type annotation of the cell from the joint RNA data\n.obs[\"cell_type\"] - The cell type annotation of the cells from the ATAC data\n.obs[\"pseudotime_order_ATAC\"] - The diffusion pseudotime annotation for the developmental trajectories annotated in the data.\n.obs[\"batch\"] - The batch from which the cell was sampled. Format is s1d1 for Site 1 Donor 1.\nFor more info on how the QC metrics were calculated, consult the Signac documentation.\n\nATAC feature metadata\nThe ATAC URL DataFrames have the following columns:\n\n.URL - Genomic coordinates for each ATAC peak that are directly related to the reference genome, and include the chromosome name*, start position, and end position in the following format: chr1-1234570-1234870.\n.var[\"feature_types\"] - Denotes the each feature as a gene expression feature. Should be ATAC for all peaks\n.var[\"n_cells-[batch]\"] - The number of cells in [batch] in which the peak was detected.\n*For the curious, chromosome names like KI270726.1 represent scaffold that are either unlocalized or unplaced (see Genome Assemblies from Ensembl)\n\nThere is also information about the observations in the .obs DataFrame of each AnnData object.", "## Potential biases\nCell type identification and doublet removal were already performed. Donors varied by age (22 - 40), sex, and ethnicity\n(details in the associated datasheet).", "#### Who are the annotators?\n\n\n\nBurkhardt DB, Lücken MD, Lance C, Cannoodt R, Pisco AO, Krishnaswamy S, Theis FJ, Bloom JM\n\nURL" ]
414145f332e8a8c3d13f45f531eab6c5d9838016
# Dataset Card for Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ConvexAI/Seraphim-8x10.7B-bf16](https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T12:17:24.179405](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16/blob/main/results_2024-01-21T12-17-24.179405.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6652967970541726, "acc_stderr": 0.03151994892831824, "acc_norm": 0.6662016910120943, "acc_norm_stderr": 0.0321599041095528, "mc1": 0.5618115055079559, "mc1_stderr": 0.017369236164404417, "mc2": 0.7077444338481541, "mc2_stderr": 0.01511580206193018 }, "harness|arc:challenge|25": { "acc": 0.6808873720136519, "acc_stderr": 0.013621696119173307, "acc_norm": 0.7098976109215017, "acc_norm_stderr": 0.013261573677520764 }, "harness|hellaswag|10": { "acc": 0.7123083051185023, "acc_stderr": 0.004517614647703243, "acc_norm": 0.8871738697470624, "acc_norm_stderr": 0.0031573355082588515 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.44, "acc_stderr": 0.0498887651569859, "acc_norm": 0.44, "acc_norm_stderr": 0.0498887651569859 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361073, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361073 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.04461960433384741, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8125, "acc_stderr": 0.032639560491693344, "acc_norm": 0.8125, "acc_norm_stderr": 0.032639560491693344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816508, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.048580835742663454, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.048580835742663454 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.03196758697835363, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.03196758697835363 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6206896551724138, "acc_stderr": 0.040434618619167466, "acc_norm": 0.6206896551724138, "acc_norm_stderr": 0.040434618619167466 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47883597883597884, "acc_stderr": 0.025728230952130733, "acc_norm": 0.47883597883597884, "acc_norm_stderr": 0.025728230952130733 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8064516129032258, "acc_stderr": 0.022475258525536057, "acc_norm": 0.8064516129032258, "acc_norm_stderr": 0.022475258525536057 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8838383838383839, "acc_stderr": 0.022828881775249377, "acc_norm": 0.8838383838383839, "acc_norm_stderr": 0.022828881775249377 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.02385479568097113, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.02385479568097113 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465073, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465073 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7058823529411765, "acc_stderr": 0.029597329730978086, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.029597329730978086 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092448, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092448 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.03392238405321617, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.024152225962801584, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.024152225962801584 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8607594936708861, "acc_stderr": 0.022535526352692705, "acc_norm": 0.8607594936708861, "acc_norm_stderr": 0.022535526352692705 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.039153454088478354, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.039153454088478354 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.039166677628225836, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.039166677628225836 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077823, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077823 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8045977011494253, "acc_stderr": 0.014179171373424383, "acc_norm": 0.8045977011494253, "acc_norm_stderr": 0.014179171373424383 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.02344582627654554, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.02344582627654554 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3653631284916201, "acc_stderr": 0.016104833880142295, "acc_norm": 0.3653631284916201, "acc_norm_stderr": 0.016104833880142295 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.761437908496732, "acc_stderr": 0.02440439492808787, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.02440439492808787 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.025218040373410622, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.025218040373410622 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7808641975308642, "acc_stderr": 0.02301670564026219, "acc_norm": 0.7808641975308642, "acc_norm_stderr": 0.02301670564026219 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5283687943262412, "acc_stderr": 0.029779450957303062, "acc_norm": 0.5283687943262412, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4895697522816167, "acc_stderr": 0.012767457253930647, "acc_norm": 0.4895697522816167, "acc_norm_stderr": 0.012767457253930647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7463235294117647, "acc_stderr": 0.026431329870789527, "acc_norm": 0.7463235294117647, "acc_norm_stderr": 0.026431329870789527 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6977124183006536, "acc_stderr": 0.018579232711113884, "acc_norm": 0.6977124183006536, "acc_norm_stderr": 0.018579232711113884 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.027979823538744546, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.027979823538744546 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.024845753212306053, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.024845753212306053 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.028762349126466108, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466108 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.038444531817709175, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.038444531817709175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.5618115055079559, "mc1_stderr": 0.017369236164404417, "mc2": 0.7077444338481541, "mc2_stderr": 0.01511580206193018 }, "harness|winogrande|5": { "acc": 0.8374112075769534, "acc_stderr": 0.010370455551343333 }, "harness|gsm8k|5": { "acc": 0.643669446550417, "acc_stderr": 0.013191685031357456 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
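As a usage addendum to the loading example near the top of this card: the sketch below is a minimal, illustrative example, assuming only the `datasets` library plus the configuration and split names already listed in this card's metadata (and that the `results` configuration follows the same split layout as the task configurations).

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16"

# Aggregated metrics for the whole run (the "results" configuration mentioned above);
# per this card, the "train" split always points to the latest results.
results = load_dataset(REPO, "results", split="train")

# A single task pinned to the first of the two evaluation runs instead of "latest".
# The split name is taken verbatim from this card's config list.
gsm8k_first_run = load_dataset(
    REPO,
    "harness_gsm8k_5",
    split="2024_01_20T22_34_11.436862",
)

print(results)
print(gsm8k_first_run)
```

Pinning a timestamped split like this is mainly useful for comparing the two evaluation runs recorded in this dataset.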
open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16
[ "region:us" ]
2024-01-20T22:36:29+00:00
{"pretty_name": "Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Seraphim-8x10.7B-bf16](https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T12:17:24.179405](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16/blob/main/results_2024-01-21T12-17-24.179405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6652967970541726,\n \"acc_stderr\": 0.03151994892831824,\n \"acc_norm\": 0.6662016910120943,\n \"acc_norm_stderr\": 0.0321599041095528,\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7077444338481541,\n \"mc2_stderr\": 0.01511580206193018\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173307,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7123083051185023,\n \"acc_stderr\": 0.004517614647703243,\n \"acc_norm\": 0.8871738697470624,\n \"acc_norm_stderr\": 0.0031573355082588515\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6692307692307692,\n \"acc_stderr\": 0.02385479568097113,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097113\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077823,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077823\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n 
\"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.016104833880142295,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.016104833880142295\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n \"acc_stderr\": 0.012767457253930647,\n \"acc_norm\": 0.4895697522816167,\n \"acc_norm_stderr\": 0.012767457253930647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6977124183006536,\n \"acc_stderr\": 0.018579232711113884,\n \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.018579232711113884\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7077444338481541,\n \"mc2_stderr\": 0.01511580206193018\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343333\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.643669446550417,\n \"acc_stderr\": 0.013191685031357456\n }\n}\n```", "repo_url": "https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|arc:challenge|25_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|gsm8k|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hellaswag|10_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-34-11.436862.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T22-34-11.436862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-34-11.436862.parquet"]}, 
{"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["**/details_harness|winogrande|5_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": ["**/details_harness|winogrande|5_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T12-17-24.179405.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_20T22_34_11.436862", "path": ["results_2024-01-20T22-34-11.436862.parquet"]}, {"split": "2024_01_21T12_17_24.179405", "path": 
["results_2024-01-21T12-17-24.179405.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T12-17-24.179405.parquet"]}]}]}
2024-01-21T12:20:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16 Dataset automatically created during the evaluation run of model ConvexAI/Seraphim-8x10.7B-bf16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T12:17:24.179405 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
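The load snippet that normally accompanies this card was stripped from the flattened text above; below is a minimal sketch of the call it refers to. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming (it is not shown in this record), while `harness_winogrande_5` and the `latest` split are taken from the config metadata listed earlier for this dataset.

```python
from datasets import load_dataset

# Minimal sketch of the load call described in the card text above.
# NOTE: the repository id below is assumed from the "details_<org>__<model>"
# naming convention; the config name and the "latest" split are confirmed by
# the config metadata for this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```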
[ "# Dataset Card for Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Seraphim-8x10.7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:17:24.179405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Seraphim-8x10.7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:17:24.179405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d951053841ea24ad121c75f95926d5702fd4d060
# Dataset Card for "gsm8k_sympy_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tfshaman/gsm8k_sympy_v2
[ "region:us" ]
2024-01-20T22:59:39+00:00
{"dataset_info": {"features": [{"name": "gsm8k_id", "dtype": "int64"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "code_output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14906106, "num_examples": 4046}], "download_size": 5683964, "dataset_size": 14906106}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-20T23:16:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "gsm8k_sympy_v2" More Information needed
[ "# Dataset Card for \"gsm8k_sympy_v2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"gsm8k_sympy_v2\"\n\nMore Information needed" ]
05cf1f19acee3f34048cb8b1d1f9a80e3ba58c84
# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-20T22:59:03.772290](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0/blob/main/results_2024-01-20T22-59-03.772290.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6529630632334766, "acc_stderr": 0.0320866881299104, "acc_norm": 0.6520494030755, "acc_norm_stderr": 0.03275917888152689, "mc1": 0.576499388004896, "mc1_stderr": 0.01729742144853475, "mc2": 0.6976349185949319, "mc2_stderr": 0.015099062052694076 }, "harness|arc:challenge|25": { "acc": 0.7133105802047781, "acc_stderr": 0.013214986329274774, "acc_norm": 0.7337883959044369, "acc_norm_stderr": 0.012915774781523197 }, "harness|hellaswag|10": { "acc": 0.7220673172674766, "acc_stderr": 0.004470644845242894, "acc_norm": 0.888070105556662, "acc_norm_stderr": 0.003146358383260359 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736411, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736411 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944427, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944427 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250447, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250447 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.026558372502661916, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.026558372502661916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903336, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903336 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.023703099525258172, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.023703099525258172 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43575418994413406, "acc_stderr": 0.016583881958602394, "acc_norm": 0.43575418994413406, "acc_norm_stderr": 0.016583881958602394 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 
0.024191808600712995 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46936114732724904, "acc_stderr": 0.012746237711716634, "acc_norm": 0.46936114732724904, "acc_norm_stderr": 0.012746237711716634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.027833023871399673, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.027833023871399673 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.576499388004896, "mc1_stderr": 0.01729742144853475, "mc2": 0.6976349185949319, "mc2_stderr": 0.015099062052694076 }, "harness|winogrande|5": { "acc": 0.8382004735595896, "acc_stderr": 0.010350128010292404 }, "harness|gsm8k|5": { "acc": 0.7081122062168309, "acc_stderr": 0.012522795894420869 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
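The card above describes an aggregated `results` configuration but only shows code for a per-task config. A hedged sketch for reading it, assuming the `results` config uses the same timestamped/`latest` split layout as the other configs in this repository; the parquet's column names are not documented here, so the snippet just inspects them.

```python
from datasets import load_dataset

# Sketch only: load the aggregated "results" config mentioned in the card text.
# Assumes the split layout (timestamped splits plus "latest") matches the
# per-task configs; column names inside the results parquet are not documented
# in this record, so we simply print them.
results = load_dataset(
    "open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```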
open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0
[ "region:us" ]
2024-01-20T23:01:19+00:00
{"pretty_name": "Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-20T22:59:03.772290](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0/blob/main/results_2024-01-20T22-59-03.772290.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6529630632334766,\n \"acc_stderr\": 0.0320866881299104,\n \"acc_norm\": 0.6520494030755,\n \"acc_norm_stderr\": 0.03275917888152689,\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976349185949319,\n \"mc2_stderr\": 0.015099062052694076\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274774,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523197\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7220673172674766,\n \"acc_stderr\": 0.004470644845242894,\n \"acc_norm\": 0.888070105556662,\n \"acc_norm_stderr\": 0.003146358383260359\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903336,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903336\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976349185949319,\n \"mc2_stderr\": 0.015099062052694076\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292404\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \"acc_stderr\": 
0.012522795894420869\n }\n}\n```", "repo_url": "https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|arc:challenge|25_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|gsm8k|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hellaswag|10_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-59-03.772290.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-59-03.772290.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-59-03.772290.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-20T22-59-03.772290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-59-03.772290.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_20T22_59_03.772290", "path": ["**/details_harness|winogrande|5_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-20T22-59-03.772290.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_20T22_59_03.772290", "path": ["results_2024-01-20T22-59-03.772290.parquet"]}, {"split": "latest", "path": ["results_2024-01-20T22-59-03.772290.parquet"]}]}]}
2024-01-20T23:01:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0 Dataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading example after this card text): ## Latest results These are the latest results from run 2024-01-20T22:59:03.772290 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
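The card text above says "you can for instance do the following", but the accompanying code block was lost when the card was flattened into this dump. A minimal reconstruction, assuming the repository id follows the leaderboard's `details_<org>__<model>` convention used by the other cards in this dump:

```python
from datasets import load_dataset

# Repository id assumed from the details_<org>__<model> naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-Instruct-v2.0",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```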
[ "# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0\n\n\n\nDataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-20T22:59:03.772290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0\n\n\n\nDataset automatically created during the evaluation run of model zhengr/MixTAO-7Bx2-MoE-Instruct-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-20T22:59:03.772290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d6f33f8687ac1d3dfdcfbcfadc3e0fc193b67479
# Dataset Card for Evaluation run of CultriX/CultriX-MoE-BF16 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/CultriX-MoE-BF16](https://huggingface.co/CultriX/CultriX-MoE-BF16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T00:00:58.593995](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16/blob/main/results_2024-01-21T00-00-58.593995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6561191463216446, "acc_stderr": 0.0319488590579618, "acc_norm": 0.6563583741296464, "acc_norm_stderr": 0.032603919592091336, "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168103, "mc2": 0.6347153460815503, "mc2_stderr": 0.015215857179912357 }, "harness|arc:challenge|25": { "acc": 0.6561433447098977, "acc_stderr": 0.013880644570156217, "acc_norm": 0.689419795221843, "acc_norm_stderr": 0.013522292098053067 }, "harness|hellaswag|10": { "acc": 0.6906990639314877, "acc_stderr": 0.004612608206670411, "acc_norm": 0.8696474805815575, "acc_norm_stderr": 0.0033600276617653984 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.0253795249107784, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.0253795249107784 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 
0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669235, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669235 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.02531049537694486, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.02531049537694486 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.01326534626132379, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.01326534626132379 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4402234636871508, "acc_stderr": 0.01660256461504994, "acc_norm": 0.4402234636871508, "acc_norm_stderr": 0.01660256461504994 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291296, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.025196929874827072, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.025196929874827072 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168103, "mc2": 0.6347153460815503, "mc2_stderr": 0.015215857179912357 }, "harness|winogrande|5": { "acc": 0.8105761641673244, "acc_stderr": 0.011012790432989243 }, "harness|gsm8k|5": { "acc": 0.6997725549658832, "acc_stderr": 0.012625423152283032 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
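The card above notes that an additional "results" configuration stores the aggregated results of the run. A short sketch of how that aggregate view could be pulled, assuming the "results" config exposes a "latest" split alias as the per-task configs do (the repository id is taken from the loading snippet in the card itself):

```python
from datasets import load_dataset

# Aggregated metrics for the run; "results" is the extra config described in the card.
results = load_dataset(
    "open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16",
    "results",
    split="latest",  # assumed alias for the most recent timestamped run
)
print(results[0])  # one row of aggregated scores
```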
open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16
[ "region:us" ]
2024-01-21T00:03:15+00:00
{"pretty_name": "Evaluation run of CultriX/CultriX-MoE-BF16", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/CultriX-MoE-BF16](https://huggingface.co/CultriX/CultriX-MoE-BF16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T00:00:58.593995](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16/blob/main/results_2024-01-21T00-00-58.593995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6561191463216446,\n \"acc_stderr\": 0.0319488590579618,\n \"acc_norm\": 0.6563583741296464,\n \"acc_norm_stderr\": 0.032603919592091336,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6347153460815503,\n \"mc2_stderr\": 0.015215857179912357\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156217,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053067\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6906990639314877,\n \"acc_stderr\": 0.004612608206670411,\n \"acc_norm\": 0.8696474805815575,\n \"acc_norm_stderr\": 0.0033600276617653984\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8352490421455939,\n \"acc_stderr\": 0.01326534626132379,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.01326534626132379\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6347153460815503,\n \"mc2_stderr\": 0.015215857179912357\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \"acc_stderr\": 0.012625423152283032\n }\n}\n```", "repo_url": 
"https://huggingface.co/CultriX/CultriX-MoE-BF16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-00-58.593995.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-00-58.593995.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-00-58.593995.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-00-58.593995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-00-58.593995.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T00_00_58.593995", "path": ["**/details_harness|winogrande|5_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T00-00-58.593995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T00_00_58.593995", "path": ["results_2024-01-21T00-00-58.593995.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T00-00-58.593995.parquet"]}]}]}
2024-01-21T00:03:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/CultriX-MoE-BF16 Dataset automatically created during the evaluation run of model CultriX/CultriX-MoE-BF16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T00:00:58.593995 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
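The flattened card text above says "To load the details from a run, you can for instance do the following:" but the accompanying code block was dropped when the markdown was flattened. A sketch of the referenced snippet, mirroring the sibling cards in this dump; the repo id is assumed from the `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_CultriX__CultriX-MoE-BF16",
    "harness_winogrande_5",
    split="train",
)
```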
[ "# Dataset Card for Evaluation run of CultriX/CultriX-MoE-BF16\n\n\n\nDataset automatically created during the evaluation run of model CultriX/CultriX-MoE-BF16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:00:58.593995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/CultriX-MoE-BF16\n\n\n\nDataset automatically created during the evaluation run of model CultriX/CultriX-MoE-BF16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:00:58.593995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
93706560a96925cf6d9b9d6f4581a51bcc889896
# Dataset Card for Evaluation run of CultriX/CultriX-MoE-Model <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/CultriX-MoE-Model](https://huggingface.co/CultriX/CultriX-MoE-Model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__CultriX-MoE-Model", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T00:41:32.165896](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CultriX-MoE-Model/blob/main/results_2024-01-21T00-41-32.165896.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6528639307439001, "acc_stderr": 0.03206966833785275, "acc_norm": 0.6541650124393522, "acc_norm_stderr": 0.032715966570260886, "mc1": 0.5287637698898409, "mc1_stderr": 0.017474513848525518, "mc2": 0.680406671341865, "mc2_stderr": 0.015098028399460912 }, "harness|arc:challenge|25": { "acc": 0.6689419795221843, "acc_stderr": 0.013752062419817837, "acc_norm": 0.7005119453924915, "acc_norm_stderr": 0.013385021637313577 }, "harness|hellaswag|10": { "acc": 0.7028480382393946, "acc_stderr": 0.0045607003179278195, "acc_norm": 0.8722366062537343, "acc_norm_stderr": 0.0033314391934060406 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754407, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754407 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629475, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.022755204959542946, "acc_norm": 0.8, "acc_norm_stderr": 0.022755204959542946 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6871794871794872, "acc_stderr": 0.02350757902064536, "acc_norm": 0.6871794871794872, "acc_norm_stderr": 0.02350757902064536 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 
0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538271, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.0251956584289318, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.0251956584289318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601453, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601453 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037182, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037182 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.03322015795776741, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.03322015795776741 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406974, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406974 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712995, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712995 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4621903520208605, "acc_stderr": 0.01273367188034251, "acc_norm": 0.4621903520208605, "acc_norm_stderr": 0.01273367188034251 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.02841820861940676, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.02841820861940676 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5287637698898409, "mc1_stderr": 0.017474513848525518, "mc2": 0.680406671341865, "mc2_stderr": 0.015098028399460912 }, "harness|winogrande|5": { "acc": 0.8089976322020521, "acc_stderr": 0.011047808761510427 }, "harness|gsm8k|5": { "acc": 0.6209249431387415, "acc_stderr": 0.013363630295088356 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
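The card above notes an aggregated "results" configuration alongside the per-task configs. A short usage sketch; the repo id is taken from the card itself, while the "latest" split name for the results config is an assumption carried over from the configs layout of the sibling card in this dump:

```python
from datasets import load_dataset

# Repo id as given in the card for CultriX/CultriX-MoE-Model.
REPO = "open-llm-leaderboard/details_CultriX__CultriX-MoE-Model"

# Aggregated run-level results; the "latest" split name is assumed from the
# leaderboard's usual configs layout.
results = load_dataset(REPO, "results", split="latest")
print(results.column_names)
print(results[0])
```

If only the headline numbers are needed, the same metrics appear inline in the JSON block under "Latest results" in the card above.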
open-llm-leaderboard/details_CultriX__CultriX-MoE-Model
[ "region:us" ]
2024-01-21T00:43:46+00:00
{"pretty_name": "Evaluation run of CultriX/CultriX-MoE-Model", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/CultriX-MoE-Model](https://huggingface.co/CultriX/CultriX-MoE-Model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__CultriX-MoE-Model\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T00:41:32.165896](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__CultriX-MoE-Model/blob/main/results_2024-01-21T00-41-32.165896.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528639307439001,\n \"acc_stderr\": 0.03206966833785275,\n \"acc_norm\": 0.6541650124393522,\n \"acc_norm_stderr\": 0.032715966570260886,\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.680406671341865,\n \"mc2_stderr\": 0.015098028399460912\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6689419795221843,\n \"acc_stderr\": 0.013752062419817837,\n \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.013385021637313577\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7028480382393946,\n \"acc_stderr\": 0.0045607003179278195,\n \"acc_norm\": 0.8722366062537343,\n \"acc_norm_stderr\": 0.0033314391934060406\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601453,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601453\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 
0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.01273367188034251,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.01273367188034251\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.680406671341865,\n \"mc2_stderr\": 0.015098028399460912\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6209249431387415,\n \"acc_stderr\": 0.013363630295088356\n }\n}\n```", "repo_url": 
"https://huggingface.co/CultriX/CultriX-MoE-Model", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-41-32.165896.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-41-32.165896.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-41-32.165896.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-41-32.165896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-41-32.165896.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T00_41_32.165896", "path": ["**/details_harness|winogrande|5_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T00-41-32.165896.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T00_41_32.165896", "path": ["results_2024-01-21T00-41-32.165896.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T00-41-32.165896.parquet"]}]}]}
2024-01-21T00:44:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/CultriX-MoE-Model Dataset automatically created during the evaluation run of model CultriX/CultriX-MoE-Model on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T00:41:32.165896 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
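The flattened card text above stops right after "To load the details from a run, you can for instance do the following:", because this plain-text rendering drops the code block that the rendered card carries. A minimal sketch of the intended call, reusing only the repository id and the harness_winogrande_5 config quoted in the metadata above, is:

```python
from datasets import load_dataset

# Per-task details for this run; "harness_winogrande_5" and the "train" split
# are the names referenced in the card and metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_CultriX__CultriX-MoE-Model",
    "harness_winogrande_5",
    split="train",
)
```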
[ "# Dataset Card for Evaluation run of CultriX/CultriX-MoE-Model\n\n\n\nDataset automatically created during the evaluation run of model CultriX/CultriX-MoE-Model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:41:32.165896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/CultriX-MoE-Model\n\n\n\nDataset automatically created during the evaluation run of model CultriX/CultriX-MoE-Model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:41:32.165896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ed8f65bbfa011b05f4c11d8f765b02f55afa332e
# Dataset Card for Evaluation run of Steelskull/Umbra-MoE-4x10.7 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Steelskull/Umbra-MoE-4x10.7](https://huggingface.co/Steelskull/Umbra-MoE-4x10.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T00:54:53.184339](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7/blob/main/results_2024-01-21T00-54-53.184339.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6674032750190299, "acc_stderr": 0.0314926889496487, "acc_norm": 0.6684896093314947, "acc_norm_stderr": 0.03213090427046816, "mc1": 0.5324357405140759, "mc1_stderr": 0.017466632149577613, "mc2": 0.6782098863716366, "mc2_stderr": 0.015273304296026847 }, "harness|arc:challenge|25": { "acc": 0.6715017064846417, "acc_stderr": 0.013724978465537302, "acc_norm": 0.7030716723549488, "acc_norm_stderr": 0.013352025976725228 }, "harness|hellaswag|10": { "acc": 0.7002589125672177, "acc_stderr": 0.0045720816569656455, "acc_norm": 0.8781119298944433, "acc_norm_stderr": 0.0032648787375868854 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.04461960433384741, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.028637235639800886, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.028637235639800886 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.048108401480826346, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.048108401480826346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909281, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.625531914893617, "acc_stderr": 0.03163910665367291, "acc_norm": 0.625531914893617, "acc_norm_stderr": 0.03163910665367291 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.040703290137070705, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.040703290137070705 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47883597883597884, "acc_stderr": 0.025728230952130726, "acc_norm": 0.47883597883597884, "acc_norm_stderr": 0.025728230952130726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8129032258064516, "acc_stderr": 0.022185710092252252, "acc_norm": 0.8129032258064516, "acc_norm_stderr": 0.022185710092252252 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8787878787878788, "acc_stderr": 0.023253157951942084, "acc_norm": 0.8787878787878788, "acc_norm_stderr": 0.023253157951942084 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603347, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603347 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.024121125416941183, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.024121125416941183 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857403, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857403 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7352941176470589, "acc_stderr": 0.028657491285071987, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.028657491285071987 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 
0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590172, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590172 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5787037037037037, "acc_stderr": 0.033674621388960775, "acc_norm": 0.5787037037037037, "acc_norm_stderr": 0.033674621388960775 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.024152225962801584, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.024152225962801584 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8565400843881856, "acc_stderr": 0.022818291821017012, "acc_norm": 0.8565400843881856, "acc_norm_stderr": 0.022818291821017012 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.03114679648297246, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.03114679648297246 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596915, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596915 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990946, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990946 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.01358661921990333, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.01358661921990333 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4044692737430168, "acc_stderr": 0.01641444091729315, "acc_norm": 0.4044692737430168, "acc_norm_stderr": 0.01641444091729315 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7647058823529411, "acc_stderr": 0.024288619466046102, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.024288619466046102 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.0254942593506949, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.0254942593506949 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7839506172839507, "acc_stderr": 0.022899162918445806, "acc_norm": 0.7839506172839507, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 
0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.49282920469361147, "acc_stderr": 0.012768922739553304, "acc_norm": 0.49282920469361147, "acc_norm_stderr": 0.012768922739553304 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7536764705882353, "acc_stderr": 0.02617343857052, "acc_norm": 0.7536764705882353, "acc_norm_stderr": 0.02617343857052 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.696078431372549, "acc_stderr": 0.01860755213127983, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.01860755213127983 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960234, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960234 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.02619392354445412, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.02619392354445412 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.030944459778533207, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.030944459778533207 }, "harness|truthfulqa:mc|0": { "mc1": 0.5324357405140759, "mc1_stderr": 0.017466632149577613, "mc2": 0.6782098863716366, "mc2_stderr": 0.015273304296026847 }, "harness|winogrande|5": { "acc": 0.8326756116811366, "acc_stderr": 0.010490608806828075 }, "harness|gsm8k|5": { "acc": 0.6474601971190296, "acc_stderr": 0.013159909755930333 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
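As a complement to the per-task loading snippet in the summary above, here is a minimal sketch of pulling the aggregated metrics from the "results" configuration. The config name and the "latest" split are taken from this card's own metadata; the exact column layout of the returned rows may vary between harness versions, so treat this as an illustration rather than the canonical API.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
# The "results" config and the "latest" split are listed in the card metadata below.
results = load_dataset(
    "open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7",
    "results",
    split="latest",
)

# One row per stored run; print the aggregated scores of the latest one.
print(results[0])
```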
open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7
[ "region:us" ]
2024-01-21T00:57:11+00:00
{"pretty_name": "Evaluation run of Steelskull/Umbra-MoE-4x10.7", "dataset_summary": "Dataset automatically created during the evaluation run of model [Steelskull/Umbra-MoE-4x10.7](https://huggingface.co/Steelskull/Umbra-MoE-4x10.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T00:54:53.184339](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7/blob/main/results_2024-01-21T00-54-53.184339.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6674032750190299,\n \"acc_stderr\": 0.0314926889496487,\n \"acc_norm\": 0.6684896093314947,\n \"acc_norm_stderr\": 0.03213090427046816,\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6782098863716366,\n \"mc2_stderr\": 0.015273304296026847\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537302,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725228\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7002589125672177,\n \"acc_stderr\": 0.0045720816569656455,\n \"acc_norm\": 0.8781119298944433,\n \"acc_norm_stderr\": 0.0032648787375868854\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.024121125416941183,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941183\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.028657491285071987,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.028657491285071987\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n 
\"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n \"acc_stderr\": 0.01641444091729315,\n \"acc_norm\": 0.4044692737430168,\n \"acc_norm_stderr\": 0.01641444091729315\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n \"acc_stderr\": 0.012768922739553304,\n \"acc_norm\": 0.49282920469361147,\n \"acc_norm_stderr\": 0.012768922739553304\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6782098863716366,\n \"mc2_stderr\": 0.015273304296026847\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \"acc_stderr\": 0.013159909755930333\n }\n}\n```", "repo_url": "https://huggingface.co/Steelskull/Umbra-MoE-4x10.7", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["**/details_harness|winogrande|5_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T00-54-53.184339.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T00_54_53.184339", "path": ["results_2024-01-21T00-54-53.184339.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T00-54-53.184339.parquet"]}]}]}
2024-01-21T00:57:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Steelskull/Umbra-MoE-4x10.7 Dataset automatically created during the evaluation run of model Steelskull/Umbra-MoE-4x10.7 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T00:54:53.184339 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Steelskull/Umbra-MoE-4x10.7\n\n\n\nDataset automatically created during the evaluation run of model Steelskull/Umbra-MoE-4x10.7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:54:53.184339(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Steelskull/Umbra-MoE-4x10.7\n\n\n\nDataset automatically created during the evaluation run of model Steelskull/Umbra-MoE-4x10.7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T00:54:53.184339(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9f8e98c9a8bf92707484c0855c833290a1525f33
This is data from the Amazon Armbench dataset (https://armbench.s3.amazonaws.com/index.html).
correll/armbench-segmentation-mix-object-tote
[ "license:cc-by-4.0", "region:us" ]
2024-01-21T01:05:28+00:00
{"license": "cc-by-4.0", "dataset_info": {"features": [{"name": "rgb", "dtype": "image"}, {"name": "mask", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 13895663120.768, "num_examples": 30992}], "download_size": 12376280750, "dataset_size": 13895663120.768}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-30T00:00:58+00:00
[]
[]
TAGS #license-cc-by-4.0 #region-us
This is data from the Amazon Armbench dataset (URL).
[]
[ "TAGS\n#license-cc-by-4.0 #region-us \n" ]
84726c9ab7d3498990415280c695b170a4d34e9b
# Dataset Card for Evaluation run of freecs/Zero-7B-test-1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [freecs/Zero-7B-test-1](https://huggingface.co/freecs/Zero-7B-test-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_freecs__Zero-7B-test-1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T01:23:06.554598](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-1/blob/main/results_2024-01-21T01-23-06.554598.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6323605845767026, "acc_stderr": 0.03255278125181369, "acc_norm": 0.6353706994655927, "acc_norm_stderr": 0.03320467391531255, "mc1": 0.4259485924112607, "mc1_stderr": 0.017310471904076544, "mc2": 0.589746084776469, "mc2_stderr": 0.015454907205144513 }, "harness|arc:challenge|25": { "acc": 0.6083617747440273, "acc_stderr": 0.014264122124938213, "acc_norm": 0.6612627986348123, "acc_norm_stderr": 0.013830568927974332 }, "harness|hellaswag|10": { "acc": 0.6447918741286597, "acc_stderr": 0.00477598265035592, "acc_norm": 0.8462457677753435, "acc_norm_stderr": 0.0035997580435468027 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.042039210401562783, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.042039210401562783 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6416184971098265, "acc_stderr": 0.036563436533531585, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.036563436533531585 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.41228070175438597, "acc_stderr": 0.046306532033665956, "acc_norm": 0.41228070175438597, "acc_norm_stderr": 0.046306532033665956 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138208, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138208 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6806451612903226, "acc_stderr": 0.02652270967466777, "acc_norm": 0.6806451612903226, "acc_norm_stderr": 0.02652270967466777 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721164, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721164 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758723, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758723 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6256410256410256, "acc_stderr": 0.024537591572830503, "acc_norm": 0.6256410256410256, "acc_norm_stderr": 0.024537591572830503 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131143, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131143 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886786, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 
0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.818348623853211, "acc_stderr": 0.016530617409266854, "acc_norm": 0.818348623853211, "acc_norm_stderr": 0.016530617409266854 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.02704462171947409, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.02704462171947409 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601464, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601464 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097652, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097652 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8160919540229885, "acc_stderr": 0.013853724170922526, "acc_norm": 0.8160919540229885, "acc_norm_stderr": 0.013853724170922526 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436593, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436593 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3642458100558659, "acc_stderr": 0.016094338768474596, "acc_norm": 0.3642458100558659, "acc_norm_stderr": 0.016094338768474596 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292452, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292452 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7222222222222222, "acc_stderr": 0.024922001168886335, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.024922001168886335 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46088657105606257, "acc_stderr": 0.012731102790504515, "acc_norm": 0.46088657105606257, "acc_norm_stderr": 0.012731102790504515 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687492, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687492 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458618, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653693, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653693 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333047, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.4259485924112607, "mc1_stderr": 0.017310471904076544, "mc2": 0.589746084776469, "mc2_stderr": 0.015454907205144513 }, "harness|winogrande|5": { "acc": 0.7963693764798737, "acc_stderr": 0.01131779878162692 }, "harness|gsm8k|5": { "acc": 0.5451099317664898, "acc_stderr": 0.0137163187717946 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_freecs__Zero-7B-test-1
[ "region:us" ]
2024-01-21T01:25:27+00:00
{"pretty_name": "Evaluation run of freecs/Zero-7B-test-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/Zero-7B-test-1](https://huggingface.co/freecs/Zero-7B-test-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__Zero-7B-test-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T01:23:06.554598](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-1/blob/main/results_2024-01-21T01-23-06.554598.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6323605845767026,\n \"acc_stderr\": 0.03255278125181369,\n \"acc_norm\": 0.6353706994655927,\n \"acc_norm_stderr\": 0.03320467391531255,\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.017310471904076544,\n \"mc2\": 0.589746084776469,\n \"mc2_stderr\": 0.015454907205144513\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938213,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6447918741286597,\n \"acc_stderr\": 0.00477598265035592,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.0035997580435468027\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.02652270967466777,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.02652270967466777\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n 
\"acc_stderr\": 0.024537591572830503,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601464,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601464\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n 
\"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.017310471904076544,\n \"mc2\": 0.589746084776469,\n \"mc2_stderr\": 0.015454907205144513\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.01131779878162692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5451099317664898,\n \"acc_stderr\": 0.0137163187717946\n }\n}\n```", "repo_url": "https://huggingface.co/freecs/Zero-7B-test-1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|arc:challenge|25_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|gsm8k|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hellaswag|10_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-23-06.554598.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-23-06.554598.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-23-06.554598.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T01-23-06.554598.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-23-06.554598.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-23-06.554598.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["**/details_harness|winogrande|5_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T01-23-06.554598.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T01_23_06.554598", "path": ["results_2024-01-21T01-23-06.554598.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T01-23-06.554598.parquet"]}]}]}
2024-01-21T01:25:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of freecs/Zero-7B-test-1 Dataset automatically created during the evaluation run of model freecs/Zero-7B-test-1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T01:23:06.554598 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
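The card above says "you can for instance do the following", but its code block was lost when the text was flattened. A minimal sketch of the load call it refers to, assuming the repository follows the open-llm-leaderboard/details_<org>__<model> naming used by the neighbouring cards (the id is not shown in the flattened text itself):

```python
from datasets import load_dataset

# Hypothetical repository id inferred from the card title above; the flattened
# card no longer shows it explicitly. "harness_winogrande_5" is one of the
# configs listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_freecs__Zero-7B-test-1",
    "harness_winogrande_5",
    split="train",
)
```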
[ "# Dataset Card for Evaluation run of freecs/Zero-7B-test-1\n\n\n\nDataset automatically created during the evaluation run of model freecs/Zero-7B-test-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T01:23:06.554598(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of freecs/Zero-7B-test-1\n\n\n\nDataset automatically created during the evaluation run of model freecs/Zero-7B-test-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T01:23:06.554598(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a1eb6887c96f6585069259ba05fdda4ae3968167
# Dataset Card for Evaluation run of Ba2han/TinyOpenHermes-1.1B-4k <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Ba2han/TinyOpenHermes-1.1B-4k](https://huggingface.co/Ba2han/TinyOpenHermes-1.1B-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T01:30:57.754034](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k/blob/main/results_2024-01-21T01-30-57.754034.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26936710062529556, "acc_stderr": 0.031263567264242856, "acc_norm": 0.2711224560641922, "acc_norm_stderr": 0.03208061243582453, "mc1": 0.23378212974296206, "mc1_stderr": 0.01481619599193158, "mc2": 0.3732859557148343, "mc2_stderr": 0.014080021093470342 }, "harness|arc:challenge|25": { "acc": 0.31313993174061433, "acc_stderr": 0.013552671543623497, "acc_norm": 0.3361774744027304, "acc_norm_stderr": 0.01380485502620576 }, "harness|hellaswag|10": { "acc": 0.44373630750846443, "acc_stderr": 0.004958089432669985, "acc_norm": 0.5853415654252141, "acc_norm_stderr": 0.004916561213591294 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.21481481481481482, "acc_stderr": 0.035478541985608236, "acc_norm": 0.21481481481481482, "acc_norm_stderr": 0.035478541985608236 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.24342105263157895, "acc_stderr": 0.034923496688842384, "acc_norm": 0.24342105263157895, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2528301886792453, "acc_stderr": 0.026749899771241235, "acc_norm": 0.2528301886792453, "acc_norm_stderr": 0.026749899771241235 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr":
0.04725815626252605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.19653179190751446, "acc_stderr": 0.03029957466478815, "acc_norm": 0.19653179190751446, "acc_norm_stderr": 0.03029957466478815 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.20588235294117646, "acc_stderr": 0.04023382273617748, "acc_norm": 0.20588235294117646, "acc_norm_stderr": 0.04023382273617748 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.22, "acc_stderr": 0.041633319989322674, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322674 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3021276595744681, "acc_stderr": 0.030017554471880554, "acc_norm": 0.3021276595744681, "acc_norm_stderr": 0.030017554471880554 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2, "acc_stderr": 0.0333333333333333, "acc_norm": 0.2, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2698412698412698, "acc_stderr": 0.02286083830923207, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.02286083830923207 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147125, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147125 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25161290322580643, "acc_stderr": 0.024685979286239966, "acc_norm": 0.25161290322580643, "acc_norm_stderr": 0.024685979286239966 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2660098522167488, "acc_stderr": 0.031089826002937523, "acc_norm": 0.2660098522167488, "acc_norm_stderr": 0.031089826002937523 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23636363636363636, "acc_stderr": 0.033175059300091805, "acc_norm": 0.23636363636363636, "acc_norm_stderr": 0.033175059300091805 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25252525252525254, "acc_stderr": 0.03095405547036592, "acc_norm": 0.25252525252525254, "acc_norm_stderr": 0.03095405547036592 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3626943005181347, "acc_stderr": 0.034697137917043715, "acc_norm": 0.3626943005181347, "acc_norm_stderr": 0.034697137917043715 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.28974358974358977, "acc_stderr": 0.023000628243687964, "acc_norm": 0.28974358974358977, "acc_norm_stderr": 0.023000628243687964 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.22962962962962963, "acc_stderr": 0.025644108639267624, "acc_norm": 0.22962962962962963, "acc_norm_stderr": 0.025644108639267624 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341937, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341937 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 
0.03757949922943342, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943342 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.25137614678899084, "acc_stderr": 0.018599206360287415, "acc_norm": 0.25137614678899084, "acc_norm_stderr": 0.018599206360287415 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.41203703703703703, "acc_stderr": 0.03356787758160835, "acc_norm": 0.41203703703703703, "acc_norm_stderr": 0.03356787758160835 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.27941176470588236, "acc_stderr": 0.03149328104507956, "acc_norm": 0.27941176470588236, "acc_norm_stderr": 0.03149328104507956 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.28270042194092826, "acc_stderr": 0.029312814153955934, "acc_norm": 0.28270042194092826, "acc_norm_stderr": 0.029312814153955934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3542600896860987, "acc_stderr": 0.032100621541349864, "acc_norm": 0.3542600896860987, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2366412213740458, "acc_stderr": 0.0372767357559692, "acc_norm": 0.2366412213740458, "acc_norm_stderr": 0.0372767357559692 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2222222222222222, "acc_stderr": 0.0401910747255735, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3312883435582822, "acc_stderr": 0.03697983910025588, "acc_norm": 0.3312883435582822, "acc_norm_stderr": 0.03697983910025588 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.1875, "acc_stderr": 0.0370468111477387, "acc_norm": 0.1875, "acc_norm_stderr": 0.0370468111477387 }, "harness|hendrycksTest-management|5": { "acc": 0.27184466019417475, "acc_stderr": 0.044052680241409216, "acc_norm": 0.27184466019417475, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004243, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004243 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2707535121328225, "acc_stderr": 0.01588988836256049, "acc_norm": 0.2707535121328225, "acc_norm_stderr": 0.01588988836256049 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.23410404624277456, "acc_stderr": 0.022797110278071145, "acc_norm": 0.23410404624277456, "acc_norm_stderr": 0.022797110278071145 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961452, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961452 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2222222222222222, "acc_stderr": 0.023805186524888146, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.023805186524888146 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3054662379421222, "acc_stderr": 0.026160584450140478, "acc_norm": 0.3054662379421222, "acc_norm_stderr": 0.026160584450140478 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, "acc_stderr": 0.024569223600460845, "acc_norm": 0.2654320987654321, "acc_norm_stderr": 0.024569223600460845 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590613, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590613 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.22294654498044328, "acc_stderr": 0.010630525747386077, "acc_norm": 0.22294654498044328, "acc_norm_stderr": 0.010630525747386077 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.22794117647058823, "acc_stderr": 0.025483081468029804, "acc_norm": 0.22794117647058823, "acc_norm_stderr": 0.025483081468029804 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2369281045751634, "acc_stderr": 0.017201662169789775, "acc_norm": 0.2369281045751634, "acc_norm_stderr": 0.017201662169789775 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3090909090909091, "acc_stderr": 0.044262946482000985, "acc_norm": 0.3090909090909091, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.19591836734693877, "acc_stderr": 0.025409301953225678, "acc_norm": 0.19591836734693877, "acc_norm_stderr": 0.025409301953225678 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.030769444967296024, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.030769444967296024 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.30120481927710846, "acc_stderr": 0.0357160923005348, "acc_norm": 0.30120481927710846, "acc_norm_stderr": 0.0357160923005348 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2807017543859649, "acc_stderr": 0.034462962170884265, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.23378212974296206, "mc1_stderr": 0.01481619599193158, "mc2": 0.3732859557148343, "mc2_stderr": 0.014080021093470342 }, "harness|winogrande|5": { "acc": 0.5990528808208366, "acc_stderr": 0.01377397455494803 }, "harness|gsm8k|5": { "acc": 0.000758150113722517, "acc_stderr": 0.0007581501137225406 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
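As a complement to the winogrande example given in the card above, a short sketch (not part of the original card) of pulling one of the per-task detail configs and inspecting it as a DataFrame; the config and split names are taken from the metadata listing that follows this card.

```python
from datasets import load_dataset

# Per-sample details for the five-shot GSM8K evaluation of this model; the config
# name "harness_gsm8k_5" and the split name "latest" come from the metadata
# listing below.
details = load_dataset(
    "open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k",
    "harness_gsm8k_5",
    split="latest",
)
df = details.to_pandas()  # pandas is a dependency of the `datasets` library
print(df.shape)
```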
open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k
[ "region:us" ]
2024-01-21T01:33:16+00:00
{"pretty_name": "Evaluation run of Ba2han/TinyOpenHermes-1.1B-4k", "dataset_summary": "Dataset automatically created during the evaluation run of model [Ba2han/TinyOpenHermes-1.1B-4k](https://huggingface.co/Ba2han/TinyOpenHermes-1.1B-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T01:30:57.754034](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k/blob/main/results_2024-01-21T01-30-57.754034.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26936710062529556,\n \"acc_stderr\": 0.031263567264242856,\n \"acc_norm\": 0.2711224560641922,\n \"acc_norm_stderr\": 0.03208061243582453,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3732859557148343,\n \"mc2_stderr\": 0.014080021093470342\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.31313993174061433,\n \"acc_stderr\": 0.013552671543623497,\n \"acc_norm\": 0.3361774744027304,\n \"acc_norm_stderr\": 0.01380485502620576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44373630750846443,\n \"acc_stderr\": 0.004958089432669985,\n \"acc_norm\": 0.5853415654252141,\n \"acc_norm_stderr\": 0.004916561213591294\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.035478541985608236,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.035478541985608236\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241235,\n \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241235\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.03029957466478815,\n \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.03029957466478815\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880554,\n \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880554\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239966,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239966\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.03095405547036592,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.03095405547036592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.034697137917043715,\n \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.034697137917043715\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.023000628243687964,\n \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.023000628243687964\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267624,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267624\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341937,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341937\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.03149328104507956,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.03149328104507956\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004243,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004243\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n 
\"acc_stderr\": 0.01588988836256049,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.01588988836256049\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071145,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071145\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961452,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3054662379421222,\n \"acc_stderr\": 0.026160584450140478,\n \"acc_norm\": 0.3054662379421222,\n \"acc_norm_stderr\": 0.026160584450140478\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590613,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590613\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22294654498044328,\n \"acc_stderr\": 0.010630525747386077,\n \"acc_norm\": 0.22294654498044328,\n \"acc_norm_stderr\": 0.010630525747386077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789775,\n \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789775\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296024,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3732859557148343,\n \"mc2_stderr\": 0.014080021093470342\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5990528808208366,\n \"acc_stderr\": 0.01377397455494803\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225406\n 
}\n}\n```", "repo_url": "https://huggingface.co/Ba2han/TinyOpenHermes-1.1B-4k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|arc:challenge|25_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|gsm8k|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hellaswag|10_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-30-57.754034.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-30-57.754034.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-30-57.754034.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T01-30-57.754034.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-30-57.754034.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T01_30_57.754034", "path": ["**/details_harness|winogrande|5_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T01-30-57.754034.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T01_30_57.754034", "path": ["results_2024-01-21T01-30-57.754034.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T01-30-57.754034.parquet"]}]}]}
2024-01-21T01:33:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Ba2han/TinyOpenHermes-1.1B-4k Dataset automatically created during the evaluation run of model Ba2han/TinyOpenHermes-1.1B-4k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T01:30:57.754034 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
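A minimal loading sketch for this run, assuming the leaderboard's usual `details_<org>__<model>` repository naming (the config and split names below are taken from this record's metadata):

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard's naming convention;
# config and split names appear in the metadata for this record.
data = load_dataset(
    "open-llm-leaderboard/details_Ba2han__TinyOpenHermes-1.1B-4k",
    "harness_winogrande_5",  # one of the 63 per-task configs
    split="latest",          # or the timestamped split "2024_01_21T01_30_57.754034"
)
print(data)
```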
[ "# Dataset Card for Evaluation run of Ba2han/TinyOpenHermes-1.1B-4k\n\n\n\nDataset automatically created during the evaluation run of model Ba2han/TinyOpenHermes-1.1B-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T01:30:57.754034(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Ba2han/TinyOpenHermes-1.1B-4k\n\n\n\nDataset automatically created during the evaluation run of model Ba2han/TinyOpenHermes-1.1B-4k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T01:30:57.754034(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1366b2095b70d3ebf1402ac6caa1a811b9c4cd1f
# Dataset Card for "espanol_dolly_alpaca_format_combined" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jtatman/espanol_dolly_alpaca_format_combined
[ "region:us" ]
2024-01-21T01:33:44+00:00
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 84673816, "num_examples": 97047}], "download_size": 53214883, "dataset_size": 84673816}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T01:34:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "espanol_dolly_alpaca_format_combined" More Information needed
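A minimal usage sketch based only on the id, config, and schema listed in this record's metadata (a single `default` config with one `train` split of `instruction`/`input`/`output` strings); the field access below is illustrative:

```python
from datasets import load_dataset

# Dataset id, split, and field names are taken from the record's metadata above.
ds = load_dataset("jtatman/espanol_dolly_alpaca_format_combined", split="train")

row = ds[0]
print(row["instruction"])  # string
print(row["input"])        # string (may be empty for instruction-only examples)
print(row["output"])       # string
```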
[ "# Dataset Card for \"espanol_dolly_alpaca_format_combined\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"espanol_dolly_alpaca_format_combined\"\n\nMore Information needed" ]
dd9a159cf6d470622c9741214a7f26faf24718db
# Dataset Card for "speech_robust_bench" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mshah1/speech_robust_bench
[ "region:us" ]
2024-01-21T01:39:08+00:00
{"dataset_info": [{"config_name": "librispeech_asr-test.clean", "features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "text", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "None.0", "num_bytes": 367982506.42, "num_examples": 2620}, {"name": "gnoise.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "gnoise.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "gnoise.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "gnoise.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "rir.1", "num_bytes": 745300827.34, "num_examples": 2620}, {"name": "rir.2", "num_bytes": 747348827.34, "num_examples": 2620}, {"name": "rir.3", "num_bytes": 720596827.34, "num_examples": 2620}, {"name": "rir.4", "num_bytes": 721812827.34, "num_examples": 2620}, {"name": "speedup.1", "num_bytes": 498896619.34, "num_examples": 2620}, {"name": "speedup.2", "num_bytes": 415901075.34, "num_examples": 2620}, {"name": "speedup.3", "num_bytes": 356617835.34, "num_examples": 2620}, {"name": "speedup.4", "num_bytes": 312152811.34, "num_examples": 2620}, {"name": "slowdown.1", "num_bytes": 712320343.34, "num_examples": 2620}, {"name": "slowdown.2", "num_bytes": 830887339.34, "num_examples": 2620}, {"name": "slowdown.3", "num_bytes": 996880127.34, "num_examples": 2620}, {"name": "slowdown.4", "num_bytes": 1245871847.34, "num_examples": 2620}, {"name": "pitch_up.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_up.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_down.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_down.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_down.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_down.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "pitch_up.1", "num_bytes": 623392458.5, "num_examples": 2620}, {"name": "pitch_up.2", "num_bytes": 623392458.5, "num_examples": 2620}, {"name": "resample.1", "num_bytes": 623392535.34, "num_examples": 2620}, {"name": "resample.2", "num_bytes": 623392535.34, "num_examples": 2620}, {"name": "resample.3", "num_bytes": 623392579.34, "num_examples": 2620}, {"name": "resample.4", "num_bytes": 623392623.34, "num_examples": 2620}, {"name": "env_noise_esc50.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise_esc50.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise_esc50.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "env_noise_esc50.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "voice_conversion.4", "num_bytes": 799852214.5, "num_examples": 2620}, {"name": "voice_conversion.3", "num_bytes": 580185782.5, "num_examples": 2620}, {"name": "voice_conversion.1", "num_bytes": 589259446.5, "num_examples": 2620}, {"name": "voice_conversion.2", "num_bytes": 571175606.5, "num_examples": 2620}, {"name": "gain.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "gain.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "gain.3", "num_bytes": 623392467.34, 
"num_examples": 2620}, {"name": "echo.1", "num_bytes": 633872467.34, "num_examples": 2620}, {"name": "echo.2", "num_bytes": 644352467.34, "num_examples": 2620}, {"name": "echo.3", "num_bytes": 665312467.34, "num_examples": 2620}, {"name": "echo.4", "num_bytes": 707232467.34, "num_examples": 2620}, {"name": "phaser.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "phaser.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "phaser.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "tempo_up.1", "num_bytes": 498896595.34, "num_examples": 2620}, {"name": "tempo_up.2", "num_bytes": 415899351.34, "num_examples": 2620}, {"name": "tempo_up.3", "num_bytes": 356615595.34, "num_examples": 2620}, {"name": "tempo_up.4", "num_bytes": 312152811.34, "num_examples": 2620}, {"name": "tempo_down.1", "num_bytes": 712318083.34, "num_examples": 2620}, {"name": "tempo_down.2", "num_bytes": 830885583.34, "num_examples": 2620}, {"name": "tempo_down.3", "num_bytes": 996880103.34, "num_examples": 2620}, {"name": "tempo_down.4", "num_bytes": 1245871847.34, "num_examples": 2620}, {"name": "gain.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "phaser.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "lowpass.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "lowpass.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "lowpass.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "lowpass.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "highpass.1", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "highpass.2", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "highpass.3", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "highpass.4", "num_bytes": 623392467.34, "num_examples": 2620}, {"name": "voice_conversion_vctk.1", "num_bytes": 495165825.88, "num_examples": 2620}, {"name": "universal_adv.1", "num_bytes": 623392467.34, "num_examples": 2620}], "download_size": 44398674574, "dataset_size": 45287590077.71995}, {"config_name": "librispeech_asr-test.clean_pertEval_500_30", "features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "text", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}, {"name": "id", "dtype": "string"}, {"name": "pert_idx", "dtype": "int64"}], "splits": [{"name": "gnoise.1", "num_bytes": 3592401090.0, "num_examples": 15000}, {"name": "env_noise_esc50.1", "num_bytes": 3592401090.0, "num_examples": 15000}], "download_size": 7170899040, "dataset_size": 7184802180.0}, {"config_name": "multilingual_librispeech-spanish_test", "features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "text", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "None.0", "num_bytes": 596762288.01, "num_examples": 2385}, {"name": "gnoise.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gnoise.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gnoise.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gnoise.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise.3", "num_bytes": 1153485830.17, 
"num_examples": 2385}, {"name": "env_noise.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "rir.1", "num_bytes": 1268493860.17, "num_examples": 2385}, {"name": "rir.2", "num_bytes": 1252109860.17, "num_examples": 2385}, {"name": "rir.3", "num_bytes": 1249517860.17, "num_examples": 2385}, {"name": "rir.4", "num_bytes": 1222893860.17, "num_examples": 2385}, {"name": "speedup.1", "num_bytes": 923001764.17, "num_examples": 2385}, {"name": "speedup.2", "num_bytes": 769347364.17, "num_examples": 2385}, {"name": "speedup.3", "num_bytes": 659593516.17, "num_examples": 2385}, {"name": "speedup.4", "num_bytes": 577275652.17, "num_examples": 2385}, {"name": "slowdown.1", "num_bytes": 1318119422.17, "num_examples": 2385}, {"name": "slowdown.2", "num_bytes": 1537627530.17, "num_examples": 2385}, {"name": "slowdown.3", "num_bytes": 1844938056.17, "num_examples": 2385}, {"name": "slowdown.4", "num_bytes": 2305906194.17, "num_examples": 2385}, {"name": "pitch_up.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_up.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_down.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_down.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_down.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_down.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "pitch_up.1", "num_bytes": 1153485821.72, "num_examples": 2385}, {"name": "pitch_up.2", "num_bytes": 1153485821.72, "num_examples": 2385}, {"name": "resample.2", "num_bytes": 1153485842.17, "num_examples": 2385}, {"name": "env_noise_esc50.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise_esc50.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise_esc50.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "env_noise_esc50.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gain.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gain.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gain.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "gain.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "echo.1", "num_bytes": 1163025830.17, "num_examples": 2385}, {"name": "echo.2", "num_bytes": 1172565830.17, "num_examples": 2385}, {"name": "echo.3", "num_bytes": 1191645830.17, "num_examples": 2385}, {"name": "echo.4", "num_bytes": 1229805830.17, "num_examples": 2385}, {"name": "tempo_up.1", "num_bytes": 923001758.17, "num_examples": 2385}, {"name": "tempo_up.2", "num_bytes": 769345632.17, "num_examples": 2385}, {"name": "tempo_up.3", "num_bytes": 659591372.17, "num_examples": 2385}, {"name": "tempo_up.4", "num_bytes": 577275652.17, "num_examples": 2385}, {"name": "tempo_down.1", "num_bytes": 1318117252.17, "num_examples": 2385}, {"name": "tempo_down.2", "num_bytes": 1537626028.17, "num_examples": 2385}, {"name": "tempo_down.3", "num_bytes": 1844938048.17, "num_examples": 2385}, {"name": "tempo_down.4", "num_bytes": 2305906194.17, "num_examples": 2385}, {"name": "phaser.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "phaser.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "phaser.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "phaser.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "resample.1", "num_bytes": 1153485840.17, "num_examples": 2385}, {"name": "resample.3", "num_bytes": 1153485850.17, "num_examples": 2385}, 
{"name": "resample.4", "num_bytes": 1153485882.17, "num_examples": 2385}, {"name": "lowpass.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "lowpass.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "lowpass.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "lowpass.4", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "highpass.1", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "highpass.2", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "highpass.3", "num_bytes": 1153485830.17, "num_examples": 2385}, {"name": "highpass.4", "num_bytes": 1153485830.17, "num_examples": 2385}], "download_size": 81619739964, "dataset_size": 76357865769.98993}, {"config_name": "multilingual_librispeech-spanish_test_pertEval_500_30", "features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "text", "dtype": "string"}, {"name": "speaker_id", "dtype": "int64"}, {"name": "chapter_id", "dtype": "int64"}, {"name": "id", "dtype": "string"}, {"name": "pert_idx", "dtype": "int64"}], "splits": [{"name": "gnoise.1", "num_bytes": 7341021960.0, "num_examples": 15000}, {"name": "env_noise_esc50.1", "num_bytes": 7341021960.0, "num_examples": 15000}], "download_size": 14645523867, "dataset_size": 14682043920.0}], "configs": [{"config_name": "librispeech_asr-test.clean", "data_files": [{"split": "None.0", "path": "librispeech_asr-test.clean/None.0-*"}, {"split": "gnoise.1", "path": "librispeech_asr-test.clean/gnoise.1-*"}, {"split": "gnoise.2", "path": "librispeech_asr-test.clean/gnoise.2-*"}, {"split": "gnoise.3", "path": "librispeech_asr-test.clean/gnoise.3-*"}, {"split": "gnoise.4", "path": "librispeech_asr-test.clean/gnoise.4-*"}, {"split": "env_noise.1", "path": "librispeech_asr-test.clean/env_noise.1-*"}, {"split": "env_noise.2", "path": "librispeech_asr-test.clean/env_noise.2-*"}, {"split": "env_noise.3", "path": "librispeech_asr-test.clean/env_noise.3-*"}, {"split": "env_noise.4", "path": "librispeech_asr-test.clean/env_noise.4-*"}, {"split": "rir.1", "path": "librispeech_asr-test.clean/rir.1-*"}, {"split": "rir.2", "path": "librispeech_asr-test.clean/rir.2-*"}, {"split": "rir.3", "path": "librispeech_asr-test.clean/rir.3-*"}, {"split": "rir.4", "path": "librispeech_asr-test.clean/rir.4-*"}, {"split": "speedup.1", "path": "librispeech_asr-test.clean/speedup.1-*"}, {"split": "speedup.2", "path": "librispeech_asr-test.clean/speedup.2-*"}, {"split": "speedup.3", "path": "librispeech_asr-test.clean/speedup.3-*"}, {"split": "speedup.4", "path": "librispeech_asr-test.clean/speedup.4-*"}, {"split": "slowdown.1", "path": "librispeech_asr-test.clean/slowdown.1-*"}, {"split": "slowdown.2", "path": "librispeech_asr-test.clean/slowdown.2-*"}, {"split": "slowdown.3", "path": "librispeech_asr-test.clean/slowdown.3-*"}, {"split": "slowdown.4", "path": "librispeech_asr-test.clean/slowdown.4-*"}, {"split": "pitch_up.3", "path": "librispeech_asr-test.clean/pitch_up.3-*"}, {"split": "pitch_up.4", "path": "librispeech_asr-test.clean/pitch_up.4-*"}, {"split": "pitch_down.1", "path": "librispeech_asr-test.clean/pitch_down.1-*"}, {"split": "pitch_down.2", "path": "librispeech_asr-test.clean/pitch_down.2-*"}, {"split": "pitch_down.3", "path": "librispeech_asr-test.clean/pitch_down.3-*"}, {"split": "pitch_down.4", "path": "librispeech_asr-test.clean/pitch_down.4-*"}, {"split": "pitch_up.1", "path": "librispeech_asr-test.clean/pitch_up.1-*"}, {"split": "pitch_up.2", "path": 
"librispeech_asr-test.clean/pitch_up.2-*"}, {"split": "resample.1", "path": "librispeech_asr-test.clean/resample.1-*"}, {"split": "resample.2", "path": "librispeech_asr-test.clean/resample.2-*"}, {"split": "resample.3", "path": "librispeech_asr-test.clean/resample.3-*"}, {"split": "resample.4", "path": "librispeech_asr-test.clean/resample.4-*"}, {"split": "env_noise_esc50.1", "path": "librispeech_asr-test.clean/env_noise_esc50.1-*"}, {"split": "env_noise_esc50.2", "path": "librispeech_asr-test.clean/env_noise_esc50.2-*"}, {"split": "env_noise_esc50.3", "path": "librispeech_asr-test.clean/env_noise_esc50.3-*"}, {"split": "env_noise_esc50.4", "path": "librispeech_asr-test.clean/env_noise_esc50.4-*"}, {"split": "voice_conversion.4", "path": "librispeech_asr-test.clean/voice_conversion.4-*"}, {"split": "voice_conversion.3", "path": "librispeech_asr-test.clean/voice_conversion.3-*"}, {"split": "voice_conversion.1", "path": "librispeech_asr-test.clean/voice_conversion.1-*"}, {"split": "voice_conversion.2", "path": "librispeech_asr-test.clean/voice_conversion.2-*"}, {"split": "gain.1", "path": "librispeech_asr-test.clean/gain.1-*"}, {"split": "gain.2", "path": "librispeech_asr-test.clean/gain.2-*"}, {"split": "gain.3", "path": "librispeech_asr-test.clean/gain.3-*"}, {"split": "echo.1", "path": "librispeech_asr-test.clean/echo.1-*"}, {"split": "echo.2", "path": "librispeech_asr-test.clean/echo.2-*"}, {"split": "echo.3", "path": "librispeech_asr-test.clean/echo.3-*"}, {"split": "echo.4", "path": "librispeech_asr-test.clean/echo.4-*"}, {"split": "phaser.1", "path": "librispeech_asr-test.clean/phaser.1-*"}, {"split": "phaser.2", "path": "librispeech_asr-test.clean/phaser.2-*"}, {"split": "phaser.3", "path": "librispeech_asr-test.clean/phaser.3-*"}, {"split": "tempo_up.1", "path": "librispeech_asr-test.clean/tempo_up.1-*"}, {"split": "tempo_up.2", "path": "librispeech_asr-test.clean/tempo_up.2-*"}, {"split": "tempo_up.3", "path": "librispeech_asr-test.clean/tempo_up.3-*"}, {"split": "tempo_up.4", "path": "librispeech_asr-test.clean/tempo_up.4-*"}, {"split": "tempo_down.1", "path": "librispeech_asr-test.clean/tempo_down.1-*"}, {"split": "tempo_down.2", "path": "librispeech_asr-test.clean/tempo_down.2-*"}, {"split": "tempo_down.3", "path": "librispeech_asr-test.clean/tempo_down.3-*"}, {"split": "tempo_down.4", "path": "librispeech_asr-test.clean/tempo_down.4-*"}, {"split": "gain.4", "path": "librispeech_asr-test.clean/gain.4-*"}, {"split": "lowpass.1", "path": "librispeech_asr-test.clean/lowpass.1-*"}, {"split": "lowpass.2", "path": "librispeech_asr-test.clean/lowpass.2-*"}, {"split": "lowpass.3", "path": "librispeech_asr-test.clean/lowpass.3-*"}, {"split": "lowpass.4", "path": "librispeech_asr-test.clean/lowpass.4-*"}, {"split": "highpass.1", "path": "librispeech_asr-test.clean/highpass.1-*"}, {"split": "highpass.2", "path": "librispeech_asr-test.clean/highpass.2-*"}, {"split": "highpass.3", "path": "librispeech_asr-test.clean/highpass.3-*"}, {"split": "highpass.4", "path": "librispeech_asr-test.clean/highpass.4-*"}, {"split": "phaser.4", "path": "librispeech_asr-test.clean/phaser.4-*"}, {"split": "voice_conversion_vctk.1", "path": "librispeech_asr-test.clean/voice_conversion_vctk.1-*"}, {"split": "universal_adv.1", "path": "librispeech_asr-test.clean/universal_adv.1-*"}]}, {"config_name": "librispeech_asr-test.clean_pertEval_500_30", "data_files": [{"split": "gnoise.1", "path": "librispeech_asr-test.clean_pertEval_500_30/gnoise.1-*"}, {"split": "env_noise_esc50.1", "path": 
"librispeech_asr-test.clean_pertEval_500_30/env_noise_esc50.1-*"}]}, {"config_name": "multilingual_librispeech-spanish_test", "data_files": [{"split": "None.0", "path": "multilingual_librispeech-spanish_test/None.0-*"}, {"split": "gnoise.1", "path": "multilingual_librispeech-spanish_test/gnoise.1-*"}, {"split": "gnoise.2", "path": "multilingual_librispeech-spanish_test/gnoise.2-*"}, {"split": "gnoise.3", "path": "multilingual_librispeech-spanish_test/gnoise.3-*"}, {"split": "gnoise.4", "path": "multilingual_librispeech-spanish_test/gnoise.4-*"}, {"split": "env_noise.1", "path": "multilingual_librispeech-spanish_test/env_noise.1-*"}, {"split": "env_noise.2", "path": "multilingual_librispeech-spanish_test/env_noise.2-*"}, {"split": "env_noise.3", "path": "multilingual_librispeech-spanish_test/env_noise.3-*"}, {"split": "env_noise.4", "path": "multilingual_librispeech-spanish_test/env_noise.4-*"}, {"split": "rir.1", "path": "multilingual_librispeech-spanish_test/rir.1-*"}, {"split": "rir.2", "path": "multilingual_librispeech-spanish_test/rir.2-*"}, {"split": "rir.3", "path": "multilingual_librispeech-spanish_test/rir.3-*"}, {"split": "rir.4", "path": "multilingual_librispeech-spanish_test/rir.4-*"}, {"split": "speedup.1", "path": "multilingual_librispeech-spanish_test/speedup.1-*"}, {"split": "speedup.2", "path": "multilingual_librispeech-spanish_test/speedup.2-*"}, {"split": "speedup.3", "path": "multilingual_librispeech-spanish_test/speedup.3-*"}, {"split": "speedup.4", "path": "multilingual_librispeech-spanish_test/speedup.4-*"}, {"split": "slowdown.1", "path": "multilingual_librispeech-spanish_test/slowdown.1-*"}, {"split": "slowdown.2", "path": "multilingual_librispeech-spanish_test/slowdown.2-*"}, {"split": "slowdown.3", "path": "multilingual_librispeech-spanish_test/slowdown.3-*"}, {"split": "slowdown.4", "path": "multilingual_librispeech-spanish_test/slowdown.4-*"}, {"split": "pitch_up.3", "path": "multilingual_librispeech-spanish_test/pitch_up.3-*"}, {"split": "pitch_up.4", "path": "multilingual_librispeech-spanish_test/pitch_up.4-*"}, {"split": "pitch_down.1", "path": "multilingual_librispeech-spanish_test/pitch_down.1-*"}, {"split": "pitch_down.2", "path": "multilingual_librispeech-spanish_test/pitch_down.2-*"}, {"split": "pitch_down.3", "path": "multilingual_librispeech-spanish_test/pitch_down.3-*"}, {"split": "pitch_down.4", "path": "multilingual_librispeech-spanish_test/pitch_down.4-*"}, {"split": "pitch_up.1", "path": "multilingual_librispeech-spanish_test/pitch_up.1-*"}, {"split": "pitch_up.2", "path": "multilingual_librispeech-spanish_test/pitch_up.2-*"}, {"split": "resample.2", "path": "multilingual_librispeech-spanish_test/resample.2-*"}, {"split": "resample.3", "path": "multilingual_librispeech-spanish_test/resample.3-*"}, {"split": "resample.4", "path": "multilingual_librispeech-spanish_test/resample.4-*"}, {"split": "env_noise_esc50.1", "path": "multilingual_librispeech-spanish_test/env_noise_esc50.1-*"}, {"split": "env_noise_esc50.2", "path": "multilingual_librispeech-spanish_test/env_noise_esc50.2-*"}, {"split": "env_noise_esc50.3", "path": "multilingual_librispeech-spanish_test/env_noise_esc50.3-*"}, {"split": "env_noise_esc50.4", "path": "multilingual_librispeech-spanish_test/env_noise_esc50.4-*"}, {"split": "resample.1", "path": "multilingual_librispeech-spanish_test/resample.1-*"}, {"split": "gain.1", "path": "multilingual_librispeech-spanish_test/gain.1-*"}, {"split": "gain.2", "path": "multilingual_librispeech-spanish_test/gain.2-*"}, {"split": "gain.3", "path": 
"multilingual_librispeech-spanish_test/gain.3-*"}, {"split": "gain.4", "path": "multilingual_librispeech-spanish_test/gain.4-*"}, {"split": "echo.4", "path": "multilingual_librispeech-spanish_test/echo.4-*"}, {"split": "echo.1", "path": "multilingual_librispeech-spanish_test/echo.1-*"}, {"split": "echo.2", "path": "multilingual_librispeech-spanish_test/echo.2-*"}, {"split": "echo.3", "path": "multilingual_librispeech-spanish_test/echo.3-*"}, {"split": "tempo_up.1", "path": "multilingual_librispeech-spanish_test/tempo_up.1-*"}, {"split": "tempo_up.2", "path": "multilingual_librispeech-spanish_test/tempo_up.2-*"}, {"split": "tempo_up.3", "path": "multilingual_librispeech-spanish_test/tempo_up.3-*"}, {"split": "tempo_up.4", "path": "multilingual_librispeech-spanish_test/tempo_up.4-*"}, {"split": "tempo_down.1", "path": "multilingual_librispeech-spanish_test/tempo_down.1-*"}, {"split": "tempo_down.2", "path": "multilingual_librispeech-spanish_test/tempo_down.2-*"}, {"split": "tempo_down.3", "path": "multilingual_librispeech-spanish_test/tempo_down.3-*"}, {"split": "tempo_down.4", "path": "multilingual_librispeech-spanish_test/tempo_down.4-*"}, {"split": "lowpass.1", "path": "multilingual_librispeech-spanish_test/lowpass.1-*"}, {"split": "lowpass.2", "path": "multilingual_librispeech-spanish_test/lowpass.2-*"}, {"split": "lowpass.3", "path": "multilingual_librispeech-spanish_test/lowpass.3-*"}, {"split": "lowpass.4", "path": "multilingual_librispeech-spanish_test/lowpass.4-*"}, {"split": "highpass.1", "path": "multilingual_librispeech-spanish_test/highpass.1-*"}, {"split": "highpass.2", "path": "multilingual_librispeech-spanish_test/highpass.2-*"}, {"split": "highpass.3", "path": "multilingual_librispeech-spanish_test/highpass.3-*"}, {"split": "highpass.4", "path": "multilingual_librispeech-spanish_test/highpass.4-*"}, {"split": "phaser.1", "path": "multilingual_librispeech-spanish_test/phaser.1-*"}, {"split": "phaser.2", "path": "multilingual_librispeech-spanish_test/phaser.2-*"}, {"split": "phaser.3", "path": "multilingual_librispeech-spanish_test/phaser.3-*"}, {"split": "phaser.4", "path": "multilingual_librispeech-spanish_test/phaser.4-*"}]}, {"config_name": "multilingual_librispeech-spanish_test_pertEval_500_30", "data_files": [{"split": "gnoise.1", "path": "multilingual_librispeech-spanish_test_pertEval_500_30/gnoise.1-*"}, {"split": "env_noise_esc50.1", "path": "multilingual_librispeech-spanish_test_pertEval_500_30/env_noise_esc50.1-*"}]}]}
2024-02-01T05:05:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "speech_robust_bench" More Information needed
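A minimal usage sketch based on the config and split names listed in this record's metadata, where each perturbation/severity pair (e.g. Gaussian noise at level 1) is exposed as its own split of a config:

```python
from datasets import load_dataset

# Config and split names are taken from the record's metadata above.
ds = load_dataset(
    "mshah1/speech_robust_bench",
    "librispeech_asr-test.clean",
    split="gnoise.1",
)

sample = ds[0]
audio = sample["audio"]  # decoded Audio feature: dict with "array", "path", "sampling_rate" (16 kHz)
print(sample["text"])
print(audio["sampling_rate"], len(audio["array"]))
```

Given the multi-gigabyte split sizes listed in the metadata, passing `streaming=True` to `load_dataset` may be preferable in practice.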
[ "# Dataset Card for \"speech_robust_bench\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"speech_robust_bench\"\n\nMore Information needed" ]
c437603ccc12d5dcff4ae066fb03149bfeb4db4e
# Dataset Card for Evaluation run of BelalTab/finetuned-llama2-2048-v3.0 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BelalTab/finetuned-llama2-2048-v3.0](https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T02:20:30.010370](https://huggingface.co/datasets/open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0/blob/main/results_2024-01-21T02-20-30.010370.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.46772409508435164, "acc_stderr": 0.03438619577805813, "acc_norm": 0.4725657390462538, "acc_norm_stderr": 0.03515979149976784, "mc1": 0.2974296205630355, "mc1_stderr": 0.016002651487361002, "mc2": 0.4620705172193864, "mc2_stderr": 0.015609209255063306 }, "harness|arc:challenge|25": { "acc": 0.4684300341296928, "acc_stderr": 0.014582236460866982, "acc_norm": 0.49829351535836175, "acc_norm_stderr": 0.014611305705056992 }, "harness|hellaswag|10": { "acc": 0.5805616411073491, "acc_stderr": 0.004924586362301655, "acc_norm": 0.7708623780123481, "acc_norm_stderr": 0.004194190406000104 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.46710526315789475, "acc_stderr": 0.040601270352363966, "acc_norm": 0.46710526315789475, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.539622641509434, "acc_stderr": 0.03067609659938918, "acc_norm": 0.539622641509434, "acc_norm_stderr": 0.03067609659938918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.04161402398403279, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.04161402398403279 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, 
"acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.37572254335260113, "acc_stderr": 0.036928207672648664, "acc_norm": 0.37572254335260113, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171452, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171452 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3659574468085106, "acc_stderr": 0.03148955829745529, "acc_norm": 0.3659574468085106, "acc_norm_stderr": 0.03148955829745529 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.044629175353369355, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.044629175353369355 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.04122737111370332, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.023577604791655802, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.023577604791655802 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235172, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, "acc_stderr": 0.028396016402761, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3645320197044335, "acc_stderr": 0.033864057460620905, "acc_norm": 0.3645320197044335, "acc_norm_stderr": 0.033864057460620905 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5575757575757576, "acc_stderr": 0.038783721137112745, "acc_norm": 0.5575757575757576, "acc_norm_stderr": 0.038783721137112745 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5858585858585859, "acc_stderr": 0.03509438348879629, "acc_norm": 0.5858585858585859, "acc_norm_stderr": 0.03509438348879629 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.03257714077709662, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.03257714077709662 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4, "acc_stderr": 0.024838811988033165, "acc_norm": 0.4, "acc_norm_stderr": 0.024838811988033165 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712173, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712173 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3949579831932773, "acc_stderr": 0.031753678460966245, "acc_norm": 0.3949579831932773, "acc_norm_stderr": 0.031753678460966245 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6477064220183486, "acc_stderr": 0.02048056884399899, "acc_norm": 0.6477064220183486, "acc_norm_stderr": 0.02048056884399899 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35648148148148145, "acc_stderr": 0.032664783315272714, "acc_norm": 0.35648148148148145, "acc_norm_stderr": 0.032664783315272714 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5931372549019608, "acc_stderr": 0.03447891136353382, "acc_norm": 0.5931372549019608, "acc_norm_stderr": 0.03447891136353382 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.5822784810126582, "acc_stderr": 0.032103530322412685, "acc_norm": 0.5822784810126582, "acc_norm_stderr": 0.032103530322412685 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.547085201793722, "acc_stderr": 0.03340867501923324, "acc_norm": 0.547085201793722, "acc_norm_stderr": 0.03340867501923324 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5419847328244275, "acc_stderr": 0.04369802690578756, "acc_norm": 0.5419847328244275, "acc_norm_stderr": 0.04369802690578756 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6363636363636364, "acc_stderr": 0.043913262867240704, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.043913262867240704 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5740740740740741, "acc_stderr": 0.0478034362693679, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.0478034362693679 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5153374233128835, "acc_stderr": 0.03926522378708843, "acc_norm": 0.5153374233128835, "acc_norm_stderr": 0.03926522378708843 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.24107142857142858, "acc_stderr": 0.04059867246952688, "acc_norm": 0.24107142857142858, "acc_norm_stderr": 0.04059867246952688 }, "harness|hendrycksTest-management|5": { "acc": 0.6310679611650486, "acc_stderr": 0.0477761518115674, "acc_norm": 0.6310679611650486, "acc_norm_stderr": 0.0477761518115674 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7094017094017094, "acc_stderr": 0.029745048572674054, "acc_norm": 0.7094017094017094, "acc_norm_stderr": 0.029745048572674054 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6462324393358876, "acc_stderr": 0.01709818470816191, "acc_norm": 0.6462324393358876, "acc_norm_stderr": 0.01709818470816191 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5375722543352601, "acc_stderr": 0.026842985519615375, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.026842985519615375 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.22793296089385476, "acc_stderr": 0.014030149950805097, "acc_norm": 0.22793296089385476, "acc_norm_stderr": 0.014030149950805097 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5163398692810458, "acc_stderr": 0.02861462475280544, "acc_norm": 0.5163398692810458, "acc_norm_stderr": 0.02861462475280544 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5530546623794212, "acc_stderr": 0.028237769422085324, "acc_norm": 0.5530546623794212, "acc_norm_stderr": 0.028237769422085324 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5617283950617284, "acc_stderr": 0.027607914087400473, "acc_norm": 0.5617283950617284, "acc_norm_stderr": 
0.027607914087400473 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.32269503546099293, "acc_stderr": 0.02788913930053478, "acc_norm": 0.32269503546099293, "acc_norm_stderr": 0.02788913930053478 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3363754889178618, "acc_stderr": 0.01206708307945222, "acc_norm": 0.3363754889178618, "acc_norm_stderr": 0.01206708307945222 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4375, "acc_stderr": 0.030134614954403924, "acc_norm": 0.4375, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4477124183006536, "acc_stderr": 0.020116925347422425, "acc_norm": 0.4477124183006536, "acc_norm_stderr": 0.020116925347422425 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4775510204081633, "acc_stderr": 0.031976941187136725, "acc_norm": 0.4775510204081633, "acc_norm_stderr": 0.031976941187136725 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5572139303482587, "acc_stderr": 0.03512310964123935, "acc_norm": 0.5572139303482587, "acc_norm_stderr": 0.03512310964123935 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.038367221765980515, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.038367221765980515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.035282112582452306, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.035282112582452306 }, "harness|truthfulqa:mc|0": { "mc1": 0.2974296205630355, "mc1_stderr": 0.016002651487361002, "mc2": 0.4620705172193864, "mc2_stderr": 0.015609209255063306 }, "harness|winogrande|5": { "acc": 0.7205998421468035, "acc_stderr": 0.012610826539404686 }, "harness|gsm8k|5": { "acc": 0.14935557240333586, "acc_stderr": 0.009818090723727286 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
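As a complement to the loading example given earlier in this card, here is a minimal sketch of enumerating the available configurations and loading the aggregated results. It assumes only the standard `datasets` API; the configuration name "results", the "latest" split, and the timestamped split name are the ones declared in this card's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0"

# Enumerate the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of the "results" configuration holds the aggregated
# metrics of the most recent run (2024-01-21T02:20:30.010370 for this card).
results = load_dataset(repo, "results", split="latest")
print(results[0])
```

Per-task details can be loaded the same way by passing one of the task configuration names (for example "harness_gsm8k_5") together with either the "latest" split or the timestamped split "2024_01_21T02_20_30.010370".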
open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0
[ "region:us" ]
2024-01-21T02:22:50+00:00
{"pretty_name": "Evaluation run of BelalTab/finetuned-llama2-2048-v3.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [BelalTab/finetuned-llama2-2048-v3.0](https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T02:20:30.010370](https://huggingface.co/datasets/open-llm-leaderboard/details_BelalTab__finetuned-llama2-2048-v3.0/blob/main/results_2024-01-21T02-20-30.010370.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46772409508435164,\n \"acc_stderr\": 0.03438619577805813,\n \"acc_norm\": 0.4725657390462538,\n \"acc_norm_stderr\": 0.03515979149976784,\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4620705172193864,\n \"mc2_stderr\": 0.015609209255063306\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866982,\n \"acc_norm\": 0.49829351535836175,\n \"acc_norm_stderr\": 0.014611305705056992\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5805616411073491,\n \"acc_stderr\": 0.004924586362301655,\n \"acc_norm\": 0.7708623780123481,\n \"acc_norm_stderr\": 0.004194190406000104\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.03067609659938918,\n \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.03067609659938918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.03148955829745529,\n \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.03148955829745529\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.044629175353369355,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.044629175353369355\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.038783721137112745,\n \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.038783721137112745\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n \"acc_norm\": 0.7150259067357513,\n 
\"acc_norm_stderr\": 0.03257714077709662\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6477064220183486,\n \"acc_stderr\": 0.02048056884399899,\n \"acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.02048056884399899\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5822784810126582,\n \"acc_stderr\": 0.032103530322412685,\n \"acc_norm\": 0.5822784810126582,\n \"acc_norm_stderr\": 0.032103530322412685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n \"acc_stderr\": 0.029745048572674054,\n \"acc_norm\": 0.7094017094017094,\n \"acc_norm_stderr\": 0.029745048572674054\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n \"acc_stderr\": 0.01709818470816191,\n \"acc_norm\": 0.6462324393358876,\n \"acc_norm_stderr\": 0.01709818470816191\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n \"acc_stderr\": 0.028237769422085324,\n \"acc_norm\": 0.5530546623794212,\n \"acc_norm_stderr\": 0.028237769422085324\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400473,\n \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400473\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053478,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053478\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3363754889178618,\n \"acc_stderr\": 0.01206708307945222,\n \"acc_norm\": 0.3363754889178618,\n \"acc_norm_stderr\": 0.01206708307945222\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4477124183006536,\n \"acc_stderr\": 0.020116925347422425,\n \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.020116925347422425\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123935,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.035282112582452306,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.035282112582452306\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.4620705172193864,\n \"mc2_stderr\": 0.015609209255063306\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404686\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n 
\"acc_stderr\": 0.009818090723727286\n }\n}\n```", "repo_url": "https://huggingface.co/BelalTab/finetuned-llama2-2048-v3.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T02_20_30.010370", "path": ["**/details_harness|winogrande|5_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T02-20-30.010370.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T02_20_30.010370", "path": ["results_2024-01-21T02-20-30.010370.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T02-20-30.010370.parquet"]}]}]}
2024-01-21T02:23:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BelalTab/finetuned-llama2-2048-v3.0 Dataset automatically created during the evaluation run of model BelalTab/finetuned-llama2-2048-v3.0 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T02:20:30.010370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of BelalTab/finetuned-llama2-2048-v3.0\n\n\n\nDataset automatically created during the evaluation run of model BelalTab/finetuned-llama2-2048-v3.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:20:30.010370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BelalTab/finetuned-llama2-2048-v3.0\n\n\n\nDataset automatically created during the evaluation run of model BelalTab/finetuned-llama2-2048-v3.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:20:30.010370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
64ce2c96e61cc6a64b1f1bc82e9147d31b60fd22
# Dataset Card for Evaluation run of 222gate/Blurred-Beagle-7b-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [222gate/Blurred-Beagle-7b-slerp](https://huggingface.co/222gate/Blurred-Beagle-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T02:38:24.249991](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp/blob/main/results_2024-01-21T02-38-24.249991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6553336331341092, "acc_stderr": 0.03202704595605816, "acc_norm": 0.6548697642825446, "acc_norm_stderr": 0.03269176917171591, "mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.6938704564389065, "mc2_stderr": 0.015145137123492 }, "harness|arc:challenge|25": { "acc": 0.6996587030716723, "acc_stderr": 0.013395909309957002, "acc_norm": 0.7278156996587031, "acc_norm_stderr": 0.013006600406423704 }, "harness|hellaswag|10": { "acc": 0.7210714997012547, "acc_stderr": 0.004475557360359705, "acc_norm": 0.8857797251543518, "acc_norm_stderr": 0.0031742854949621644 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404907, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404907 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.023710888501970565, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.023710888501970565 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.02882088466625326, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.02882088466625326 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092434, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092434 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250447, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250447 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608308, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608308 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069363, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069363 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4569832402234637, "acc_stderr": 0.01666049858050917, "acc_norm": 0.4569832402234637, "acc_norm_stderr": 0.01666049858050917 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694912, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694912 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869649, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869649 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.02850145286039655, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039655 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6617647058823529, "acc_stderr": 0.019139943748487043, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.019139943748487043 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197771, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197771 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.6938704564389065, "mc2_stderr": 0.015145137123492 }, "harness|winogrande|5": { "acc": 0.8318863456985004, "acc_stderr": 0.010510336954166736 }, "harness|gsm8k|5": { "acc": 0.6990144048521607, "acc_stderr": 0.01263450446521118 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
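To complement the per-task example in the card above, a minimal sketch of loading the aggregated scores through the `results` configuration and its `latest` split, both of which are declared in this card's metadata (illustrative only):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of the run;
# its "latest" split always points at the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp",
    "results",
    split="latest",
)
print(results)
```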
open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp
[ "region:us" ]
2024-01-21T02:40:42+00:00
{"pretty_name": "Evaluation run of 222gate/Blurred-Beagle-7b-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/Blurred-Beagle-7b-slerp](https://huggingface.co/222gate/Blurred-Beagle-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T02:38:24.249991](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp/blob/main/results_2024-01-21T02-38-24.249991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6553336331341092,\n \"acc_stderr\": 0.03202704595605816,\n \"acc_norm\": 0.6548697642825446,\n \"acc_norm_stderr\": 0.03269176917171591,\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.6938704564389065,\n \"mc2_stderr\": 0.015145137123492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957002,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423704\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7210714997012547,\n \"acc_stderr\": 0.004475557360359705,\n \"acc_norm\": 0.8857797251543518,\n \"acc_norm_stderr\": 0.0031742854949621644\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.6938704564389065,\n \"mc2_stderr\": 0.015145137123492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166736\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6990144048521607,\n \"acc_stderr\": 0.01263450446521118\n }\n}\n```", "repo_url": 
"https://huggingface.co/222gate/Blurred-Beagle-7b-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-24.249991.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-24.249991.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-24.249991.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-24.249991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-24.249991.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T02_38_24.249991", "path": ["**/details_harness|winogrande|5_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T02-38-24.249991.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T02_38_24.249991", "path": ["results_2024-01-21T02-38-24.249991.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T02-38-24.249991.parquet"]}]}]}
2024-01-21T02:41:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 222gate/Blurred-Beagle-7b-slerp Dataset automatically created during the evaluation run of model 222gate/Blurred-Beagle-7b-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T02:38:24.249991 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
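A minimal sketch of that load call, using the `harness_winogrande_5` configuration and the `latest` split listed in this record's metadata; the repository id is an assumption, following the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming, since it is not stated in this excerpt:

```python
from datasets import load_dataset

# Repo id is assumed from the details_<org>__<model> naming pattern used by the leaderboard
data = load_dataset(
    "open-llm-leaderboard/details_222gate__Blurred-Beagle-7b-slerp",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="latest",          # or the timestamped split "2024_01_21T02_38_24.249991"
)
print(data)
```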
[ "# Dataset Card for Evaluation run of 222gate/Blurred-Beagle-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blurred-Beagle-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:38:24.249991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 222gate/Blurred-Beagle-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blurred-Beagle-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:38:24.249991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f1b3429202591e1499fad1013e64bea4b6c44987
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2-Test1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyMistral-v2-Test1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2-Test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T02:38:49.773813](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1/blob/main/results_2024-01-21T02-38-49.773813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2335310199962483, "acc_stderr": 0.02999531007525961, "acc_norm": 0.23385996059713224, "acc_norm_stderr": 0.03078636978062643, "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707703, "mc2": 0.5030342289474727, "mc2_stderr": 0.015464982097707176 }, "harness|arc:challenge|25": { "acc": 0.18344709897610922, "acc_stderr": 0.011310170179554543, "acc_norm": 0.2150170648464164, "acc_norm_stderr": 0.01200571763413361 }, "harness|hellaswag|10": { "acc": 0.261700856403107, "acc_stderr": 0.004386622589119065, "acc_norm": 0.2678749253136825, "acc_norm_stderr": 0.00441946998393918 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.1925925925925926, "acc_stderr": 0.03406542058502653, "acc_norm": 0.1925925925925926, "acc_norm_stderr": 0.03406542058502653 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17105263157894737, "acc_stderr": 0.030643607071677088, "acc_norm": 0.17105263157894737, "acc_norm_stderr": 0.030643607071677088 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.20754716981132076, "acc_stderr": 0.02495991802891127, "acc_norm": 0.20754716981132076, "acc_norm_stderr": 0.02495991802891127 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2777777777777778, "acc_stderr": 0.037455547914624555, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.16, "acc_stderr": 0.03684529491774709, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.22, "acc_stderr": 
0.0416333199893227, "acc_norm": 0.22, "acc_norm_stderr": 0.0416333199893227 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.21965317919075145, "acc_stderr": 0.031568093627031744, "acc_norm": 0.21965317919075145, "acc_norm_stderr": 0.031568093627031744 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.02880998985410297, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.02880998985410297 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.02113285918275444, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.02113285918275444 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.18387096774193548, "acc_stderr": 0.022037217340267836, "acc_norm": 0.18387096774193548, "acc_norm_stderr": 0.022037217340267836 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.19704433497536947, "acc_stderr": 0.027986724666736205, "acc_norm": 0.19704433497536947, "acc_norm_stderr": 0.027986724666736205 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18181818181818182, "acc_stderr": 0.027479603010538797, "acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.027479603010538797 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2, "acc_stderr": 0.020280805062535722, "acc_norm": 0.2, "acc_norm_stderr": 0.020280805062535722 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.21851851851851853, "acc_stderr": 0.02519575225182379, "acc_norm": 0.21851851851851853, "acc_norm_stderr": 0.02519575225182379 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.22268907563025211, "acc_stderr": 0.027025433498882392, "acc_norm": 0.22268907563025211, "acc_norm_stderr": 0.027025433498882392 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1834862385321101, "acc_stderr": 0.01659525971039931, "acc_norm": 0.1834862385321101, "acc_norm_stderr": 0.01659525971039931 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134217, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134217 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.30493273542600896, "acc_stderr": 0.030898610882477515, "acc_norm": 0.30493273542600896, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.24427480916030533, "acc_stderr": 0.037683359597287434, "acc_norm": 0.24427480916030533, "acc_norm_stderr": 0.037683359597287434 }, "harness|hendrycksTest-international_law|5": { "acc": 0.23140495867768596, "acc_stderr": 0.03849856098794088, "acc_norm": 0.23140495867768596, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2147239263803681, "acc_stderr": 0.03226219377286774, "acc_norm": 0.2147239263803681, "acc_norm_stderr": 0.03226219377286774 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2948717948717949, "acc_stderr": 0.029872577708891148, "acc_norm": 0.2948717948717949, "acc_norm_stderr": 0.029872577708891148 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23627075351213284, "acc_stderr": 0.0151904737170375, "acc_norm": 0.23627075351213284, "acc_norm_stderr": 0.0151904737170375 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24566473988439305, "acc_stderr": 0.02317629820399201, "acc_norm": 0.24566473988439305, "acc_norm_stderr": 0.02317629820399201 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2223463687150838, "acc_stderr": 0.01390718920815688, "acc_norm": 0.2223463687150838, "acc_norm_stderr": 0.01390718920815688 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.21895424836601307, "acc_stderr": 0.02367908986180772, "acc_norm": 0.21895424836601307, "acc_norm_stderr": 0.02367908986180772 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2222222222222222, "acc_stderr": 0.023132376234543336, "acc_norm": 
0.2222222222222222, "acc_norm_stderr": 0.023132376234543336 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.22695035460992907, "acc_stderr": 0.02498710636564297, "acc_norm": 0.22695035460992907, "acc_norm_stderr": 0.02498710636564297 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24511082138200782, "acc_stderr": 0.010986307870045517, "acc_norm": 0.24511082138200782, "acc_norm_stderr": 0.010986307870045517 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.26838235294117646, "acc_stderr": 0.0269174812243772, "acc_norm": 0.26838235294117646, "acc_norm_stderr": 0.0269174812243772 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2549019607843137, "acc_stderr": 0.017630827375148383, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.017630827375148383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.30120481927710846, "acc_stderr": 0.0357160923005348, "acc_norm": 0.30120481927710846, "acc_norm_stderr": 0.0357160923005348 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707703, "mc2": 0.5030342289474727, "mc2_stderr": 0.015464982097707176 }, "harness|winogrande|5": { "acc": 0.48539857932123126, "acc_stderr": 0.01404649238327584 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
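Beyond the single `harness_winogrande_5` example in the card above, a minimal sketch of browsing this record's per-task configurations and pulling the `latest` split of one of them; the repository id and config names are taken from this record, and the snippet assumes the `datasets` library with access to the Hugging Face Hub:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1"

# Enumerate the per-task configurations (harness_arc_challenge_25, harness_gsm8k_5, ...)
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Pull the most recent run of a single task through its "latest" split
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```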
open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1
[ "region:us" ]
2024-01-21T02:41:08+00:00
{"pretty_name": "Evaluation run of Dans-DiscountModels/TinyMistral-v2-Test1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyMistral-v2-Test1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2-Test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T02:38:49.773813](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1/blob/main/results_2024-01-21T02-38-49.773813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2335310199962483,\n \"acc_stderr\": 0.02999531007525961,\n \"acc_norm\": 0.23385996059713224,\n \"acc_norm_stderr\": 0.03078636978062643,\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707703,\n \"mc2\": 0.5030342289474727,\n \"mc2_stderr\": 0.015464982097707176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18344709897610922,\n \"acc_stderr\": 0.011310170179554543,\n \"acc_norm\": 0.2150170648464164,\n \"acc_norm_stderr\": 0.01200571763413361\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.261700856403107,\n \"acc_stderr\": 0.004386622589119065,\n \"acc_norm\": 0.2678749253136825,\n \"acc_norm_stderr\": 0.00441946998393918\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275444,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.027986724666736205,\n \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.027986724666736205\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 
0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882392,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882392\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1834862385321101,\n \"acc_stderr\": 0.01659525971039931,\n \"acc_norm\": 0.1834862385321101,\n \"acc_norm_stderr\": 0.01659525971039931\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134217,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134217\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23627075351213284,\n \"acc_stderr\": 0.0151904737170375,\n \"acc_norm\": 0.23627075351213284,\n \"acc_norm_stderr\": 0.0151904737170375\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2223463687150838,\n \"acc_stderr\": 0.01390718920815688,\n \"acc_norm\": 0.2223463687150838,\n \"acc_norm_stderr\": 0.01390718920815688\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543336,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543336\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.02498710636564297,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.02498710636564297\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.26838235294117646,\n \"acc_stderr\": 0.0269174812243772,\n \"acc_norm\": 0.26838235294117646,\n \"acc_norm_stderr\": 0.0269174812243772\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707703,\n \"mc2\": 0.5030342289474727,\n \"mc2_stderr\": 0.015464982097707176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.01404649238327584\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Dans-DiscountModels/TinyMistral-v2-Test1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-49.773813.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-49.773813.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-49.773813.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-49.773813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-49.773813.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T02_38_49.773813", "path": ["**/details_harness|winogrande|5_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T02-38-49.773813.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T02_38_49.773813", "path": ["results_2024-01-21T02-38-49.773813.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T02-38-49.773813.parquet"]}]}]}
2024-01-21T02:41:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2-Test1 Dataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2-Test1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T02:38:49.773813 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
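The loading snippet referenced above was stripped from this processed card text. Based on the pattern used by the other leaderboard cards in this collection, it presumably looks like the sketch below; the repo id is inferred from the usual `open-llm-leaderboard/details_<org>__<model>` naming convention and is not quoted from this card:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard naming convention; not stated
# verbatim in this processed card text.
data = load_dataset(
    "open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2-Test1",
    "harness_winogrande_5",
    split="train",
)
```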
[ "# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2-Test1\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2-Test1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:38:49.773813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2-Test1\n\n\n\nDataset automatically created during the evaluation run of model Dans-DiscountModels/TinyMistral-v2-Test1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T02:38:49.773813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6731cc1c564c90b9a1f341bb3f7555ad5deb69a4
# Dataset Card for "lnl-imdb" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davidpig/lnl-imdb
[ "region:us" ]
2024-01-21T02:58:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 28437070, "num_examples": 21246}, {"name": "validation", "num_bytes": 4995701, "num_examples": 3754}, {"name": "test", "num_bytes": 32650581, "num_examples": 25000}], "download_size": 42923713, "dataset_size": 66083352}}
2024-01-21T02:59:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for "lnl-imdb" More Information needed
[ "# Dataset Card for \"lnl-imdb\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"lnl-imdb\"\n\nMore Information needed" ]
d401d99bbdd11374d573875ba7fe1e7ca7593fc6
# Dataset Card for "lnl-youruba" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davidpig/lnl-youruba
[ "region:us" ]
2024-01-21T03:01:09+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "clean_label", "dtype": "string"}, {"name": "clean_label_id", "dtype": "int64"}, {"name": "noisy_label", "dtype": "string"}, {"name": "noisy_label_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 186669, "num_examples": 1340}, {"name": "validation", "num_bytes": 26221, "num_examples": 189}, {"name": "test", "num_bytes": 52598, "num_examples": 379}], "download_size": 136869, "dataset_size": 265488}}
2024-01-21T03:01:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "lnl-youruba" More Information needed
[ "# Dataset Card for \"lnl-youruba\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"lnl-youruba\"\n\nMore Information needed" ]
4bcd61d6f55b328c52aa17fcb8716ffea7ce6b66
# Dataset Card for "lnl-hausa" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davidpig/lnl-hausa
[ "region:us" ]
2024-01-21T03:01:36+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "clean_label", "dtype": "string"}, {"name": "clean_label_id", "dtype": "int64"}, {"name": "noisy_label", "dtype": "string"}, {"name": "noisy_label_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 202396, "num_examples": 2045}, {"name": "validation", "num_bytes": 28707, "num_examples": 290}, {"name": "test", "num_bytes": 57551, "num_examples": 582}], "download_size": 133910, "dataset_size": 288654}}
2024-01-21T03:01:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for "lnl-hausa" More Information needed
[ "# Dataset Card for \"lnl-hausa\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"lnl-hausa\"\n\nMore Information needed" ]
ffb4021aeda508cccd7d5cf1f13ca43cc925e807
This dataset was generated by reformatting [`coref-data/preco_raw`](https://huggingface.co/datasets/coref-data/preco_raw) into the indiscrim coreference format. See that repo for dataset details. See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
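As a rough illustration (not part of the original card), the converted documents can be loaded with the Hugging Face `datasets` library; the field names below are taken from this repo's dataset_info schema:

```python
from datasets import load_dataset

# Load the (small) validation split of the reformatted PreCo data.
ds = load_dataset("coref-data/preco_indiscrim", split="validation")

doc = ds[0]
print(doc["id"])                    # document identifier
print(doc["sentences"][0]["text"])  # raw text of the first sentence (token-level info is under "tokens")
# Each chain is a list of mentions; each mention is a short list of integer
# indices (see the coref_chains schema in this repo's metadata).
print(doc["coref_chains"][0])
```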
coref-data/preco_indiscrim
[ "region:us" ]
2024-01-21T03:06:42+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 445303441.3894047, "num_examples": 36120}, {"name": "validation", "num_bytes": 6164222.610595303, "num_examples": 500}, {"name": "test", "num_bytes": 6053901, "num_examples": 500}], "download_size": 126986138, "dataset_size": 457521565.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-21T18:50:08+00:00
[]
[]
TAGS #region-us
This dataset was generated by reformatting 'coref-data/preco_raw' into the indiscrim coreference format. See that repo for dataset details. See ianporada/coref-data for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
[]
[ "TAGS\n#region-us \n" ]
2dd32dd409eb4d057f3dc006281a29ee077313fc
# IMO geometry questions

32 IMO geometry questions from 2000 to 2021 (filter by category "IMO")

Data source: [https://artofproblemsolving.com/wiki/index.php/Category:Olympiad_Geometry_Problems](https://artofproblemsolving.com/wiki/index.php/Category:Olympiad_Geometry_Problems)

55 more questions from others (other regional olympiad competitions) as well as 13 GPT-4 generated ones.

Only the raw questions are available; if you want to use them for Alpha Geometry, there is still a missing translation step.

This is the example shown in Alpha Geometry

Question:

```
Let ABC be an acute-angled triangle with AB ≠ AC. The circle with diameter BC intersects the sides AB and AC at M and N respectively. Denote by O the midpoint of the side BC. The bisectors of the angles ∠BAC and ∠MON intersect at R. Prove that the circumcircles of the triangles BMR and CNR have a common point lying on the side BC.
```

Translated:

```
Premise
A B C O M N R P : Points
mid_point(O,B,C) [--]
same_line(B,M,A) [00]
OM=OB [01]
same_line(N,C,A) [02]
ON=OB [03]
∠BAR=∠RAC [04]
∠MOR=∠RON [05]
circle(B,M,R,P) [06]
circle(C,N,R,P) [07]
Goal same_line(P, B, C)
```
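A minimal loading sketch (assuming the Hugging Face `datasets` library; the repo id, the `test` split, and the `category`/`question` fields come from this dataset's metadata, and the `"IMO"` filter value from the description above):

```python
from datasets import load_dataset

# Load the single "test" split of the geometry questions.
ds = load_dataset("theblackcat102/IMO-geometry", split="test")

# Keep only the questions tagged with the "IMO" category
# (the remaining rows are other olympiads and GPT-4 generated ones).
imo_only = ds.filter(lambda ex: ex["category"] == "IMO")

print(len(imo_only))            # expected: 32, per the description above
print(imo_only[0]["question"])  # raw problem statement, not yet translated
```

Note that the translation into Alpha Geometry premises (as in the example above) still has to be done separately.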
theblackcat102/IMO-geometry
[ "language:en", "license:mit", "IMO", "geometry", "math", "region:us" ]
2024-01-21T03:23:42+00:00
{"language": ["en"], "license": "mit", "dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 33953, "num_examples": 87}], "download_size": 18740, "dataset_size": 33953}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "tags": ["IMO", "geometry", "math"]}
2024-01-21T03:30:13+00:00
[]
[ "en" ]
TAGS #language-English #license-mit #IMO #geometry #math #region-us
# IMO geometry questions 32 IMO geometry questions from 2000 to 2021 (filter by category "IMO") Data source : URL 55 more questions from others (other regional olympiad competition) as well as 13 GPT-4 generate ones. Only the raw questions are available, if you want to use them for alpha geometry there's still a missing translation step. This is the example shown in Alpha Geometry Question: Translated:
[ "# IMO geometry questions\n\n32 IMO geometry questions from 2000 to 2021 (filter by category \"IMO\")\n\nData source : URL\n\n55 more questions from others (other regional olympiad competition) as well as 13 GPT-4 generate ones.\n\nOnly the raw questions are available, if you want to use them for alpha geometry there's still a missing translation step.\n\nThis is the example shown in Alpha Geometry\n\nQuestion:\n\n\nTranslated:" ]
[ "TAGS\n#language-English #license-mit #IMO #geometry #math #region-us \n", "# IMO geometry questions\n\n32 IMO geometry questions from 2000 to 2021 (filter by category \"IMO\")\n\nData source : URL\n\n55 more questions from others (other regional olympiad competition) as well as 13 GPT-4 generate ones.\n\nOnly the raw questions are available, if you want to use them for alpha geometry there's still a missing translation step.\n\nThis is the example shown in Alpha Geometry\n\nQuestion:\n\n\nTranslated:" ]
6ee1315e66f7e03870395998874b44d9c93acb1a
# Dataset Card for Evaluation run of luqmanxyz/LelaStarling-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [luqmanxyz/LelaStarling-7B](https://huggingface.co/luqmanxyz/LelaStarling-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T03:22:03.188309](https://huggingface.co/datasets/open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B/blob/main/results_2024-01-21T03-22-03.188309.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.653782163258742, "acc_stderr": 0.03202901372406034, "acc_norm": 0.6538975569099659, "acc_norm_stderr": 0.03268928703694758, "mc1": 0.40636474908200737, "mc1_stderr": 0.017193835812093897, "mc2": 0.5772632901338711, "mc2_stderr": 0.015444224853170872 }, "harness|arc:challenge|25": { "acc": 0.6382252559726962, "acc_stderr": 0.014041957945038075, "acc_norm": 0.6757679180887372, "acc_norm_stderr": 0.013678810399518824 }, "harness|hellaswag|10": { "acc": 0.6806413065126469, "acc_stderr": 0.004652753439460136, "acc_norm": 0.8632742481577375, "acc_norm_stderr": 0.00342855459595022 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.028637235639800886, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.028637235639800886 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.02882088466625326, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.02882088466625326 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8513761467889909, "acc_stderr": 0.015251253773660834, "acc_norm": 0.8513761467889909, "acc_norm_stderr": 0.015251253773660834 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579654, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579654 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545843, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545843 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594654, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608313, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608313 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4223463687150838, "acc_stderr": 0.016519594275297117, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.016519594275297117 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667888, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667888 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4771838331160365, "acc_stderr": 0.0127569333828237, "acc_norm": 0.4771838331160365, "acc_norm_stderr": 0.0127569333828237 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6895424836601307, "acc_stderr": 0.018718067052623234, "acc_norm": 0.6895424836601307, "acc_norm_stderr": 0.018718067052623234 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291293, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291293 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.40636474908200737, "mc1_stderr": 0.017193835812093897, "mc2": 0.5772632901338711, "mc2_stderr": 0.015444224853170872 }, "harness|winogrande|5": { "acc": 0.8097868981846882, "acc_stderr": 0.011030335798617443 }, "harness|gsm8k|5": { "acc": 0.711144806671721, "acc_stderr": 0.012484219800126666 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B
[ "region:us" ]
2024-01-21T03:24:22+00:00
{"pretty_name": "Evaluation run of luqmanxyz/LelaStarling-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [luqmanxyz/LelaStarling-7B](https://huggingface.co/luqmanxyz/LelaStarling-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T03:22:03.188309](https://huggingface.co/datasets/open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B/blob/main/results_2024-01-21T03-22-03.188309.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653782163258742,\n \"acc_stderr\": 0.03202901372406034,\n \"acc_norm\": 0.6538975569099659,\n \"acc_norm_stderr\": 0.03268928703694758,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5772632901338711,\n \"mc2_stderr\": 0.015444224853170872\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038075,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518824\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6806413065126469,\n \"acc_stderr\": 0.004652753439460136,\n \"acc_norm\": 0.8632742481577375,\n \"acc_norm_stderr\": 0.00342855459595022\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667888,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667888\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623234,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623234\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093897,\n \"mc2\": 0.5772632901338711,\n \"mc2_stderr\": 0.015444224853170872\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.011030335798617443\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.711144806671721,\n \"acc_stderr\": 
0.012484219800126666\n }\n}\n```", "repo_url": "https://huggingface.co/luqmanxyz/LelaStarling-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|arc:challenge|25_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|gsm8k|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hellaswag|10_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T03-22-03.188309.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T03-22-03.188309.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T03-22-03.188309.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T03-22-03.188309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T03-22-03.188309.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T03_22_03.188309", "path": ["**/details_harness|winogrande|5_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T03-22-03.188309.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T03_22_03.188309", "path": ["results_2024-01-21T03-22-03.188309.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T03-22-03.188309.parquet"]}]}]}
2024-01-21T03:24:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of luqmanxyz/LelaStarling-7B Dataset automatically created during the evaluation run of model luqmanxyz/LelaStarling-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T03:22:03.188309 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
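The card text above ends its loading instructions at "do the following:" because the original snippet was stripped when the text was processed. A minimal sketch of what that call typically looks like is below; the repository name is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming convention, and the `harness_winogrande_5` config name is taken from the configs listed in this record's metadata.

```python
from datasets import load_dataset

# Assumed repo id (standard leaderboard details naming); the config name
# "harness_winogrande_5" appears in this record's metadata config list.
data = load_dataset(
    "open-llm-leaderboard/details_luqmanxyz__LelaStarling-7B",
    "harness_winogrande_5",
    split="train",
)
print(data)
```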
[ "# Dataset Card for Evaluation run of luqmanxyz/LelaStarling-7B\n\n\n\nDataset automatically created during the evaluation run of model luqmanxyz/LelaStarling-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T03:22:03.188309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of luqmanxyz/LelaStarling-7B\n\n\n\nDataset automatically created during the evaluation run of model luqmanxyz/LelaStarling-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T03:22:03.188309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
58b5fdd88d139b17ffd75365a5e1ec5b53b8dd43
This dataset was generated by reformatting [`coref-data/litbank_raw`](https://huggingface.co/datasets/coref-data/litbank_raw) into the indiscrim coreference format. See that repo for dataset details. See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
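As an illustration, a minimal sketch of loading one of the ten cross-validation configurations this dataset exposes; the config and split names below are taken from the `dataset_info` metadata later in this record, and the column names come from its feature list.

```python
from datasets import load_dataset

# "split_0" is one of the split_0 ... split_9 configs listed in the metadata;
# each config exposes train/validation/test splits of LitBank documents.
ds = load_dataset("coref-data/litbank_indiscrim", "split_0")

doc = ds["validation"][0]
# Per the feature list, each document carries tokenized sentences plus
# coreference chains encoded as nested integer index lists.
print(doc["id"], len(doc["sentences"]), len(doc["coref_chains"]))
```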
coref-data/litbank_indiscrim
[ "region:us" ]
2024-01-21T03:30:36+00:00
{"dataset_info": [{"config_name": "split_0", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "end_char", "dtype": "int64"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "start_char", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 66722053, "num_examples": 80}, {"name": "validation", "num_bytes": 9538946, "num_examples": 10}, {"name": "test", "num_bytes": 10206291, "num_examples": 10}], "download_size": 44024474, "dataset_size": 86467290}, {"config_name": "split_1", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 51521261, "num_examples": 80}, {"name": "validation", "num_bytes": 8300522, "num_examples": 10}, {"name": "test", "num_bytes": 7127546, "num_examples": 10}], "download_size": 40296693, "dataset_size": 66949329}, {"config_name": "split_2", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 51695718, "num_examples": 80}, {"name": "validation", "num_bytes": 7127546, "num_examples": 10}, {"name": "test", "num_bytes": 8126065, "num_examples": 10}], "download_size": 40287905, "dataset_size": 66949329}, {"config_name": "split_3", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, 
{"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 52504381, "num_examples": 80}, {"name": "validation", "num_bytes": 8126065, "num_examples": 10}, {"name": "test", "num_bytes": 6318883, "num_examples": 10}], "download_size": 40292412, "dataset_size": 66949329}, {"config_name": "split_4", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 54684836, "num_examples": 80}, {"name": "validation", "num_bytes": 6318883, "num_examples": 10}, {"name": "test", "num_bytes": 5945610, "num_examples": 10}], "download_size": 40283365, "dataset_size": 66949329}, {"config_name": "split_5", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 53798360, "num_examples": 80}, {"name": "validation", "num_bytes": 5945610, "num_examples": 10}, {"name": "test", "num_bytes": 7205359, "num_examples": 10}], "download_size": 40284379, "dataset_size": 66949329}, {"config_name": "split_6", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 53481831, "num_examples": 80}, {"name": "validation", "num_bytes": 7205359, "num_examples": 10}, {"name": "test", "num_bytes": 6262139, "num_examples": 10}], "download_size": 
40294155, "dataset_size": 66949329}, {"config_name": "split_7", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 54849391, "num_examples": 80}, {"name": "validation", "num_bytes": 6262139, "num_examples": 10}, {"name": "test", "num_bytes": 5837799, "num_examples": 10}], "download_size": 40294847, "dataset_size": 66949329}, {"config_name": "split_8", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 56921350, "num_examples": 80}, {"name": "validation", "num_bytes": 5837799, "num_examples": 10}, {"name": "test", "num_bytes": 4190180, "num_examples": 10}], "download_size": 40292974, "dataset_size": 66949329}, {"config_name": "split_9", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "null"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}]}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "author", "dtype": "string"}, {"name": "comment", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "gutenberg_id", "dtype": "string"}, {"name": "title", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 55123923, "num_examples": 80}, {"name": "validation", "num_bytes": 4190180, "num_examples": 10}, {"name": "test", "num_bytes": 7635226, "num_examples": 10}], "download_size": 40294593, "dataset_size": 66949329}], "configs": [{"config_name": "split_0", "data_files": [{"split": "train", "path": "split_0/train-*"}, {"split": "validation", "path": "split_0/validation-*"}, {"split": "test", "path": "split_0/test-*"}]}, {"config_name": "split_1", "data_files": [{"split": "train", "path": "split_1/train-*"}, {"split": "validation", "path": "split_1/validation-*"}, {"split": "test", "path": "split_1/test-*"}]}, {"config_name": "split_2", "data_files": [{"split": "train", "path": "split_2/train-*"}, {"split": "validation", "path": "split_2/validation-*"}, {"split": "test", "path": "split_2/test-*"}]}, {"config_name": "split_3", "data_files": [{"split": 
"train", "path": "split_3/train-*"}, {"split": "validation", "path": "split_3/validation-*"}, {"split": "test", "path": "split_3/test-*"}]}, {"config_name": "split_4", "data_files": [{"split": "train", "path": "split_4/train-*"}, {"split": "validation", "path": "split_4/validation-*"}, {"split": "test", "path": "split_4/test-*"}]}, {"config_name": "split_5", "data_files": [{"split": "train", "path": "split_5/train-*"}, {"split": "validation", "path": "split_5/validation-*"}, {"split": "test", "path": "split_5/test-*"}]}, {"config_name": "split_6", "data_files": [{"split": "train", "path": "split_6/train-*"}, {"split": "validation", "path": "split_6/validation-*"}, {"split": "test", "path": "split_6/test-*"}]}, {"config_name": "split_7", "data_files": [{"split": "train", "path": "split_7/train-*"}, {"split": "validation", "path": "split_7/validation-*"}, {"split": "test", "path": "split_7/test-*"}]}, {"config_name": "split_8", "data_files": [{"split": "train", "path": "split_8/train-*"}, {"split": "validation", "path": "split_8/validation-*"}, {"split": "test", "path": "split_8/test-*"}]}, {"config_name": "split_9", "data_files": [{"split": "train", "path": "split_9/train-*"}, {"split": "validation", "path": "split_9/validation-*"}, {"split": "test", "path": "split_9/test-*"}]}]}
2024-02-13T03:12:58+00:00
[]
[]
TAGS #region-us
This dataset was generated by reformatting 'coref-data/litbank_raw' into the indiscrim coreference format. See that repo for dataset details. See ianporada/coref-data for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
[]
[ "TAGS\n#region-us \n" ]
869f61d2cf741a5ab17662c241521fd4ed364e3a
## Python Copilot Instructions on How to Code using Alpaca and Yaml This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset. ### Details Each row contains python code for either a class method or a global function, along with imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more. - Rows: 1737704 - Size: 28.6 GB - Data type: text - Format: Introduction to code usage using alpaca and yaml response ### Schema The instruction alpaca text with yaml response is in the **desc** column: ```json { "active": "bool", "args": "string", "args_len": "float64", "audio_file": "string", "audio_path": "string", "class_bases": "string", "class_name": "string", "code": "string", "code_len": "float64", "desc": "string", "desc_docstr": "string", "desc_docstr_len": "float64", "desc_len": "int64", "docstr": "string", "docstr_len": "int64", "file_path": "string", "file_type": "string", "function_names": "string", "gen_bytes": "int64", "gen_data_type": "string", "gen_mode": "string", "gen_size": "int64", "gen_valid": "string", "height": "int64", "image_file": "string", "image_path": "string", "method_names": "string", "name": "string", "num_all_bases": "int64", "num_bases": "int64", "num_classes": "int64", "num_functions": "float64", "num_imports": "int64", "num_methods": "float64", "prompts": "string", "raises": "string", "raises_len": "float64", "recsize": "int64", "repo": "string", "returns": "string", "returns_len": "float64", "size": "int64", "src_object": "string", "sub_file": "string", "total_objects": "int64", "usage": "string", "usages": "string", "width": "int64" } ``` ### How to use the dataset ```python from datasets import load_dataset ds = load_dataset("matlok/python-text-copilot-training-instruct", data_dir="files") ```
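As a follow-up to the usage snippet above, a hedged sketch of pulling the alpaca-style instruction text out of the `desc` column; `streaming=True` and the `"train"` split name are assumptions made only so a row can be inspected without downloading the full ~28.6 GB.

```python
from datasets import load_dataset

# data_dir="files" follows the card's own usage snippet; streaming and the
# default "train" split assignment for the parquet files are assumptions here.
ds = load_dataset(
    "matlok/python-text-copilot-training-instruct",
    data_dir="files",
    split="train",
    streaming=True,
)

row = next(iter(ds))
print(row["desc"][:500])  # alpaca-style instruction text with a YAML response
```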
matlok/python-text-copilot-training-instruct
[ "task_categories:text-generation", "task_categories:question-answering", "task_ids:parsing", "size_categories:1M<n<10M", "license:other", "python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "coding", "task", "prompt", "response", "yaml", "region:us" ]
2024-01-21T04:00:31+00:00
{"license": ["other"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot instructions on how to code using alpaca and yaml", "dataset_info": [{"config_name": "view_01_transformers_src", "splits": [{"name": "view_01_transformers_src"}]}, {"config_name": "view_02_pytorch_fsdp", "splits": [{"name": "view_02_pytorch_fsdp"}]}, {"config_name": "view_03_deepspeed_runtime", "splits": [{"name": "view_03_deepspeed_runtime"}]}, {"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "view_01_transformers_src", "data_files": [{"split": "view_01_transformers_src", "path": "files/lok-python-copilot-text.instruct-v1_00000053.parquet"}]}, {"config_name": "view_02_pytorch_fsdp", "data_files": [{"split": "view_02_pytorch_fsdp", "path": "files/lok-python-copilot-text.instruct-v1_00000040.parquet"}]}, {"config_name": "view_03_deepspeed_runtime", "data_files": [{"split": "view_03_deepspeed_runtime", "path": "files/lok-python-copilot-text.instruct-v1_00000019.parquet"}]}, {"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-text.instruct-v1_00000002.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "coding", "task", "prompt", "response", "yaml"]}
2024-01-25T19:18:34+00:00
[]
[]
TAGS #task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us
## Python Copilot Instructions on How to Code using Alpaca and Yaml This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset. ### Details Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more. - Rows: 1737704 - Size: 28.6 GB - Data type: text - Format: Introduction on code usage using alpaca and yaml response ### Schema The instruction alpaca text with yaml response is in the desc column: ### How to use the dataset
[ "## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1737704\n- Size: 28.6 GB\n- Data type: text\n- Format: Introduction on code usage using alpaca and yaml response", "### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:", "### How to use the dataset" ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us \n", "## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1737704\n- Size: 28.6 GB\n- Data type: text\n- Format: Introduction on code usage using alpaca and yaml response", "### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:", "### How to use the dataset" ]
4b1f797f82cce454efcc1a1c2bccfac0c6f97ed3
# Dataset Card for Evaluation run of 222gate/Blurdus-7b-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [222gate/Blurdus-7b-v0.1](https://huggingface.co/222gate/Blurdus-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:02:50.944739](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1/blob/main/results_2024-01-21T04-02-50.944739.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6537793068429774, "acc_stderr": 0.03204806721727468, "acc_norm": 0.6535097790386686, "acc_norm_stderr": 0.03271036283162906, "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.6971802454568737, "mc2_stderr": 0.015138148073785463 }, "harness|arc:challenge|25": { "acc": 0.7005119453924915, "acc_stderr": 0.01338502163731357, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059376 }, "harness|hellaswag|10": { "acc": 0.7225652260505875, "acc_stderr": 0.004468178273665677, "acc_norm": 0.8849830711013742, "acc_norm_stderr": 0.0031839033919416975 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.02533120243894443, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.02533120243894443 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590172, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590172 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.013306478243066302, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.013306478243066302 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.46033519553072627, "acc_stderr": 0.016669799592112032, "acc_norm": 0.46033519553072627, "acc_norm_stderr": 0.016669799592112032 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959607, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959607 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.470013037809648, "acc_stderr": 0.01274724896707907, "acc_norm": 0.470013037809648, "acc_norm_stderr": 0.01274724896707907 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6617647058823529, "acc_stderr": 0.019139943748487043, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.019139943748487043 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174934, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174934 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233264, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233264 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197771, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197771 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5740514075887393, "mc1_stderr": 0.01731047190407654, "mc2": 0.6971802454568737, "mc2_stderr": 0.015138148073785463 }, "harness|winogrande|5": { "acc": 0.829518547750592, "acc_stderr": 0.010569021122825907 }, "harness|gsm8k|5": { "acc": 0.6785443517816527, "acc_stderr": 0.012864471384836705 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
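In addition to the per-task loading example near the top of this card, the aggregated scores can be pulled from the "results" configuration, whose "latest" split always points at the newest results file (see the configs in the metadata below). A minimal sketch, assuming the standard `datasets` API; the exact columns inside the results parquet are not documented here, so the example just prints whatever fields are present.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of 222gate/Blurdus-7b-v0.1.
results = load_dataset(
    "open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1",
    "results",
    split="latest",
)

print(results.column_names)  # inspect which aggregated fields are available
print(results[0])            # the aggregated-results record for the latest run
```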
open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1
[ "region:us" ]
2024-01-21T04:05:09+00:00
{"pretty_name": "Evaluation run of 222gate/Blurdus-7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/Blurdus-7b-v0.1](https://huggingface.co/222gate/Blurdus-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:02:50.944739](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1/blob/main/results_2024-01-21T04-02-50.944739.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537793068429774,\n \"acc_stderr\": 0.03204806721727468,\n \"acc_norm\": 0.6535097790386686,\n \"acc_norm_stderr\": 0.03271036283162906,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6971802454568737,\n \"mc2_stderr\": 0.015138148073785463\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7225652260505875,\n \"acc_stderr\": 0.004468178273665677,\n \"acc_norm\": 0.8849830711013742,\n \"acc_norm_stderr\": 0.0031839033919416975\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 
0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n \"acc_stderr\": 0.016669799592112032,\n \"acc_norm\": 0.46033519553072627,\n \"acc_norm_stderr\": 0.016669799592112032\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.01274724896707907,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.01274724896707907\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174934,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174934\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6971802454568737,\n \"mc2_stderr\": 0.015138148073785463\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825907\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836705\n }\n}\n```", "repo_url": "https://huggingface.co/222gate/Blurdus-7b-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-02-50.944739.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["**/details_harness|winogrande|5_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-02-50.944739.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T04_02_50.944739", "path": ["results_2024-01-21T04-02-50.944739.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T04-02-50.944739.parquet"]}]}]}
2024-01-21T04:05:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 222gate/Blurdus-7b-v0.1 Dataset automatically created during the evaluation run of model 222gate/Blurdus-7b-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:02:50.944739 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
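The code snippet announced by "To load the details from a run, you can for instance do the following:" is not reproduced in this plain-text rendering of the card. A minimal sketch of the intended call, assuming the repository id follows the leaderboard's usual naming convention:

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_222gate__Blurdus-7b-v0.1",
    "harness_winogrande_5",
    split="train",
)
print(data)
```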
[ "# Dataset Card for Evaluation run of 222gate/Blurdus-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blurdus-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:02:50.944739(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 222gate/Blurdus-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blurdus-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:02:50.944739(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5412fb7513c109502bcd6ed7a75ab2dc8ab96af4
# Dataset Card for Evaluation run of NeuralNovel/Valor-7B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NeuralNovel/Valor-7B-v0.1](https://huggingface.co/NeuralNovel/Valor-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:34:05.220321](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1/blob/main/results_2024-01-21T04-34-05.220321.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.646546642495187, "acc_stderr": 0.03219553799568803, "acc_norm": 0.6461217235016347, "acc_norm_stderr": 0.03286614724425918, "mc1": 0.5605875152998776, "mc1_stderr": 0.0173745204825137, "mc2": 0.6983583637464785, "mc2_stderr": 0.015006114910118641 }, "harness|arc:challenge|25": { "acc": 0.6911262798634812, "acc_stderr": 0.013501770929344003, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059374 }, "harness|hellaswag|10": { "acc": 0.6906990639314877, "acc_stderr": 0.004612608206670406, "acc_norm": 0.8658633738299144, "acc_norm_stderr": 0.0034010255178737255 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305528, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305528 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.02302589961718872, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.02302589961718872 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603491, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603491 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6384615384615384, "acc_stderr": 0.024359581465397, "acc_norm": 0.6384615384615384, "acc_norm_stderr": 0.024359581465397 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, 
"acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.034076320938540516, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.034076320938540516 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676173, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676173 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.024027745155265026, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.024027745155265026 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4480446927374302, "acc_stderr": 0.016631976628930595, "acc_norm": 0.4480446927374302, "acc_norm_stderr": 0.016631976628930595 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7253086419753086, "acc_stderr": 0.024836057868294677, "acc_norm": 0.7253086419753086, "acc_norm_stderr": 0.024836057868294677 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4645390070921986, "acc_stderr": 0.029752389657427047, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.029752389657427047 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4589308996088657, "acc_stderr": 0.0127270848267998, "acc_norm": 0.4589308996088657, "acc_norm_stderr": 0.0127270848267998 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396553, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396553 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.01927099870822398, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.01927099870822398 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960234, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960234 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.5605875152998776, "mc1_stderr": 0.0173745204825137, "mc2": 0.6983583637464785, "mc2_stderr": 0.015006114910118641 }, "harness|winogrande|5": { "acc": 0.8334648776637726, "acc_stderr": 0.010470796496781105 }, "harness|gsm8k|5": { "acc": 0.6914329037149356, "acc_stderr": 0.01272307604981591 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
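The card above shows how to load the details of a single task; the aggregated scores quoted under "Latest results" live in the additional "results" configuration. A minimal sketch of reading them (the config and split names come from the card's metadata; the exact column layout of the results file is not documented there, so the snippet only inspects what it finds):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run; the "latest"
# split always points to the newest evaluation (here 2024-01-21T04:34:05.220321).
results = load_dataset(
    "open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1",
    "results",
    split="latest",
)
print(results)      # inspect the available columns
print(results[0])   # first row; holds the aggregated scores for this run
```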
open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1
[ "region:us" ]
2024-01-21T04:36:21+00:00
{"pretty_name": "Evaluation run of NeuralNovel/Valor-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Valor-7B-v0.1](https://huggingface.co/NeuralNovel/Valor-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:34:05.220321](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1/blob/main/results_2024-01-21T04-34-05.220321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.646546642495187,\n \"acc_stderr\": 0.03219553799568803,\n \"acc_norm\": 0.6461217235016347,\n \"acc_norm_stderr\": 0.03286614724425918,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.0173745204825137,\n \"mc2\": 0.6983583637464785,\n \"mc2_stderr\": 0.015006114910118641\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344003,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6906990639314877,\n \"acc_stderr\": 0.004612608206670406,\n \"acc_norm\": 0.8658633738299144,\n \"acc_norm_stderr\": 0.0034010255178737255\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305528\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6384615384615384,\n \"acc_stderr\": 0.024359581465397,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 
0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265026,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.0127270848267998,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.0127270848267998\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.01927099870822398,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.01927099870822398\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.0173745204825137,\n \"mc2\": 0.6983583637464785,\n \"mc2_stderr\": 0.015006114910118641\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6914329037149356,\n \"acc_stderr\": 0.01272307604981591\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeuralNovel/Valor-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-34-05.220321.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-34-05.220321.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-34-05.220321.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-34-05.220321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-34-05.220321.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_34_05.220321", "path": ["**/details_harness|winogrande|5_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-34-05.220321.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T04_34_05.220321", "path": ["results_2024-01-21T04-34-05.220321.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T04-34-05.220321.parquet"]}]}]}
2024-01-21T04:36:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NeuralNovel/Valor-7B-v0.1 Dataset automatically created during the evaluation run of model NeuralNovel/Valor-7B-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:34:05.220321 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
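The snippet promised by "To load the details from a run, you can for instance do the following:" was stripped from this processed text field; a minimal sketch of such a call, again assuming the inferred repository id open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1 and mirroring the pattern used by the other cards in this dump:

```python
from datasets import load_dataset

# Assumed repository id (inferred from the naming convention, not stated in this record)
data = load_dataset(
    "open-llm-leaderboard/details_NeuralNovel__Valor-7B-v0.1",
    "harness_winogrande_5",  # any of the 63 configs listed in the metadata can be substituted here
    split="train",
)
```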
[ "# Dataset Card for Evaluation run of NeuralNovel/Valor-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Valor-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:34:05.220321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NeuralNovel/Valor-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Valor-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:34:05.220321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c6b2b3323fecad7b9f46dd2d62f8b3d13fef2c0b
# Dataset Card for Evaluation run of vicgalle/solarized-13B-dpo <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vicgalle/solarized-13B-dpo](https://huggingface.co/vicgalle/solarized-13B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vicgalle__solarized-13B-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:38:15.337905](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__solarized-13B-dpo/blob/main/results_2024-01-21T04-38-15.337905.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5894114296811551, "acc_stderr": 0.033439242720178564, "acc_norm": 0.595668575251732, "acc_norm_stderr": 0.034144224072911684, "mc1": 0.5079559363525091, "mc1_stderr": 0.017501285074551825, "mc2": 0.6624959612962921, "mc2_stderr": 0.01569484808694598 }, "harness|arc:challenge|25": { "acc": 0.6023890784982935, "acc_stderr": 0.014301752223279536, "acc_norm": 0.6271331058020477, "acc_norm_stderr": 0.014131176760131163 }, "harness|hellaswag|10": { "acc": 0.6286596295558654, "acc_stderr": 0.004821757734156713, "acc_norm": 0.8181637124078869, "acc_norm_stderr": 0.0038492126228151687 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5481481481481482, "acc_stderr": 0.042992689054808644, "acc_norm": 0.5481481481481482, "acc_norm_stderr": 0.042992689054808644 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.02964781353936525, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.02964781353936525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.03265019475033582, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067877, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067877 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.043758884927270605 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.02692344605930284, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.02692344605930284 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4433497536945813, "acc_stderr": 0.03495334582162934, "acc_norm": 0.4433497536945813, "acc_norm_stderr": 0.03495334582162934 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.558974358974359, "acc_stderr": 0.025174048384000745, "acc_norm": 0.558974358974359, "acc_norm_stderr": 0.025174048384000745 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815632, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815632 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5840336134453782, "acc_stderr": 0.03201650100739611, "acc_norm": 0.5840336134453782, "acc_norm_stderr": 0.03201650100739611 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 
0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7834862385321101, "acc_stderr": 0.017658710594443128, "acc_norm": 0.7834862385321101, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.033981108902946366, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.033981108902946366 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639318, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.02765215314415926, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.02765215314415926 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.03210062154134986, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.03210062154134986 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909456, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909456 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094632, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094632 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489122, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489122 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8247863247863247, "acc_stderr": 0.02490443909891822, "acc_norm": 0.8247863247863247, "acc_norm_stderr": 0.02490443909891822 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.01480538447837116, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.01480538447837116 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5953757225433526, "acc_stderr": 0.02642481659400985, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.02642481659400985 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39329608938547483, "acc_stderr": 0.01633726869427009, "acc_norm": 0.39329608938547483, "acc_norm_stderr": 0.01633726869427009 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6503267973856209, "acc_stderr": 0.027305308076274695, "acc_norm": 0.6503267973856209, "acc_norm_stderr": 0.027305308076274695 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6237942122186495, "acc_stderr": 0.02751392568354943, "acc_norm": 0.6237942122186495, "acc_norm_stderr": 0.02751392568354943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6882716049382716, "acc_stderr": 0.025773111169630453, "acc_norm": 0.6882716049382716, "acc_norm_stderr": 0.025773111169630453 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.02958345203628407, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.02958345203628407 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4302477183833116, "acc_stderr": 0.012645361435115222, "acc_norm": 0.4302477183833116, "acc_norm_stderr": 0.012645361435115222 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6013071895424836, "acc_stderr": 0.019808281317449848, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.019808281317449848 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.04653429807913508, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.04653429807913508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5387755102040817, "acc_stderr": 0.031912820526692774, "acc_norm": 0.5387755102040817, "acc_norm_stderr": 0.031912820526692774 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.030769444967296024, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.030769444967296024 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7485380116959064, "acc_stderr": 0.033275044238468436, "acc_norm": 0.7485380116959064, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.5079559363525091, "mc1_stderr": 0.017501285074551825, "mc2": 0.6624959612962921, "mc2_stderr": 0.01569484808694598 }, "harness|winogrande|5": { "acc": 0.7600631412786109, "acc_stderr": 0.012002078629485742 }, "harness|gsm8k|5": { "acc": 0.26383623957543595, "acc_stderr": 0.012139386425126806 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
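As a complement to the loading example earlier in this card, a short sketch for pulling the most recent per-task details rather than the "train" split; the repository id and the harness_gsm8k_5 config name are taken verbatim from the metadata below, and "latest" is the split that each config's data_files section points at:

```python
from datasets import load_dataset

# Per-task details for the newest evaluation run; "latest" always tracks the most recent parquet
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_vicgalle__solarized-13B-dpo",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```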
open-llm-leaderboard/details_vicgalle__solarized-13B-dpo
[ "region:us" ]
2024-01-21T04:40:32+00:00
{"pretty_name": "Evaluation run of vicgalle/solarized-13B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/solarized-13B-dpo](https://huggingface.co/vicgalle/solarized-13B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__solarized-13B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:38:15.337905](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__solarized-13B-dpo/blob/main/results_2024-01-21T04-38-15.337905.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5894114296811551,\n \"acc_stderr\": 0.033439242720178564,\n \"acc_norm\": 0.595668575251732,\n \"acc_norm_stderr\": 0.034144224072911684,\n \"mc1\": 0.5079559363525091,\n \"mc1_stderr\": 0.017501285074551825,\n \"mc2\": 0.6624959612962921,\n \"mc2_stderr\": 0.01569484808694598\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279536,\n \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6286596295558654,\n \"acc_stderr\": 0.004821757734156713,\n \"acc_norm\": 0.8181637124078869,\n \"acc_norm_stderr\": 0.0038492126228151687\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n \"acc_stderr\": 0.02490443909891822,\n \"acc_norm\": 0.8247863247863247,\n \"acc_norm_stderr\": 0.02490443909891822\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.01480538447837116,\n \"acc_norm\": 
0.7803320561941252,\n \"acc_norm_stderr\": 0.01480538447837116\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.02642481659400985,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.02642481659400985\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n \"acc_stderr\": 0.01633726869427009,\n \"acc_norm\": 0.39329608938547483,\n \"acc_norm_stderr\": 0.01633726869427009\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630453,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630453\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n \"acc_stderr\": 0.012645361435115222,\n \"acc_norm\": 0.4302477183833116,\n \"acc_norm_stderr\": 0.012645361435115222\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.019808281317449848,\n \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.019808281317449848\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296024,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5079559363525091,\n \"mc1_stderr\": 0.017501285074551825,\n \"mc2\": 0.6624959612962921,\n \"mc2_stderr\": 0.01569484808694598\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485742\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26383623957543595,\n \"acc_stderr\": 0.012139386425126806\n }\n}\n```", "repo_url": 
"https://huggingface.co/vicgalle/solarized-13B-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-38-15.337905.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-38-15.337905.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-38-15.337905.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-38-15.337905.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-38-15.337905.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_38_15.337905", "path": ["**/details_harness|winogrande|5_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-38-15.337905.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T04_38_15.337905", "path": ["results_2024-01-21T04-38-15.337905.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T04-38-15.337905.parquet"]}]}]}
2024-01-21T04:40:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vicgalle/solarized-13B-dpo Dataset automatically created during the evaluation run of model vicgalle/solarized-13B-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:38:15.337905 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of vicgalle/solarized-13B-dpo\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/solarized-13B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:38:15.337905(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vicgalle/solarized-13B-dpo\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/solarized-13B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:38:15.337905(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
88f7d8e4ece8f19ce7e987ce71b0a3e9b7f2f5b9
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.7 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.7](https://huggingface.co/andysalerno/openchat-nectar-0.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:44:01.094706](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7/blob/main/results_2024-01-21T04-44-01.094706.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6533452700527564, "acc_stderr": 0.03187068672960971, "acc_norm": 0.654124427390922, "acc_norm_stderr": 0.032524509376303544, "mc1": 0.35495716034271724, "mc1_stderr": 0.016750862381375898, "mc2": 0.5204520312017102, "mc2_stderr": 0.015323853661186408 }, "harness|arc:challenge|25": { "acc": 0.6228668941979523, "acc_stderr": 0.014163366896192598, "acc_norm": 0.6578498293515358, "acc_norm_stderr": 0.013864152159177275 }, "harness|hellaswag|10": { "acc": 0.6334395538737303, "acc_stderr": 0.004808802114592841, "acc_norm": 0.8300139414459271, "acc_norm_stderr": 0.0037485288878381247 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.035868792800803406, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.035868792800803406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 
0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.04943110704237101, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237101 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.03533133389323657, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.03533133389323657 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554963, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554963 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494562, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494562 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6692307692307692, "acc_stderr": 0.023854795680971118, "acc_norm": 0.6692307692307692, "acc_norm_stderr": 0.023854795680971118 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, 
"acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579647, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579647 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.03050028317654585, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.03050028317654585 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243839, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8365261813537676, "acc_stderr": 0.013223928616741624, "acc_norm": 0.8365261813537676, "acc_norm_stderr": 0.013223928616741624 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7601156069364162, "acc_stderr": 0.022989592543123563, "acc_norm": 0.7601156069364162, "acc_norm_stderr": 0.022989592543123563 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859933, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859933 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7654320987654321, "acc_stderr": 0.02357688174400572, "acc_norm": 0.7654320987654321, "acc_norm_stderr": 0.02357688174400572 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.02975238965742705, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.02975238965742705 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4895697522816167, "acc_stderr": 0.012767457253930643, "acc_norm": 0.4895697522816167, "acc_norm_stderr": 0.012767457253930643 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7279411764705882, "acc_stderr": 0.02703304115168146, "acc_norm": 0.7279411764705882, "acc_norm_stderr": 0.02703304115168146 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.01890101532209309, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.01890101532209309 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174937, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174937 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578334, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578334 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197768, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197768 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072767, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072767 }, "harness|truthfulqa:mc|0": { "mc1": 0.35495716034271724, "mc1_stderr": 0.016750862381375898, "mc2": 0.5204520312017102, "mc2_stderr": 0.015323853661186408 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.01094187795567621 }, "harness|gsm8k|5": { "acc": 0.6785443517816527, "acc_stderr": 0.012864471384836703 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
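In addition to the per-task loading example near the top of this card, the short sketch below shows one way to pull the aggregated metrics rather than the per-task details. It assumes the "results" configuration and the "latest" split described above are available for this repository; the names follow the auto-generated config listing and may change if the evaluation is re-run.

```python
from datasets import load_dataset

# Load the aggregated summary produced for this evaluation run rather than
# the per-task detail files; the "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7",
    "results",
    split="latest",
)

# Each row holds the summary metrics shown in the JSON above (e.g. the "all"
# averages); printing the first row is a quick way to inspect them.
print(results[0])
```

Replacing "latest" with the timestamped split name of a particular run selects that run instead.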
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7
[ "region:us" ]
2024-01-21T04:46:20+00:00
{"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.7", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.7](https://huggingface.co/andysalerno/openchat-nectar-0.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:44:01.094706](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.7/blob/main/results_2024-01-21T04-44-01.094706.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533452700527564,\n \"acc_stderr\": 0.03187068672960971,\n \"acc_norm\": 0.654124427390922,\n \"acc_norm_stderr\": 0.032524509376303544,\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5204520312017102,\n \"mc2_stderr\": 0.015323853661186408\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192598,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6334395538737303,\n \"acc_stderr\": 0.004808802114592841,\n \"acc_norm\": 0.8300139414459271,\n \"acc_norm_stderr\": 0.0037485288878381247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8365261813537676,\n \"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859933,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859933\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400572,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400572\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n \"acc_stderr\": 0.012767457253930643,\n \"acc_norm\": 0.4895697522816167,\n \"acc_norm_stderr\": 0.012767457253930643\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5204520312017102,\n \"mc2_stderr\": 0.015323853661186408\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836703\n 
}\n}\n```", "repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.7", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_44_01.094706", "path": ["**/details_harness|winogrande|5_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-44-01.094706.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T04_44_01.094706", "path": ["results_2024-01-21T04-44-01.094706.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T04-44-01.094706.parquet"]}]}]}
2024-01-21T04:46:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.7 Dataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.7 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:44:01.094706 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.7\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:44:01.094706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.7\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:44:01.094706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
234339eb7c97688aab0d553d4ae34e8a113220c3
# Dataset Card for Evaluation run of 222gate/Blur-4x7b-MOE-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [222gate/Blur-4x7b-MOE-v0.1](https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:45:19.432784](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1/blob/main/results_2024-01-21T04-45-19.432784.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6559352269047928, "acc_stderr": 0.03199697744913013, "acc_norm": 0.6556619411144256, "acc_norm_stderr": 0.03265823717914628, "mc1": 0.5458996328029376, "mc1_stderr": 0.017429593091323515, "mc2": 0.688217063142182, "mc2_stderr": 0.01518842298057346 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068749, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059376 }, "harness|hellaswag|10": { "acc": 0.7159928301135232, "acc_stderr": 0.004500186424443795, "acc_norm": 0.8813981278629756, "acc_norm_stderr": 0.003226586783421294 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175007, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6596638655462185, "acc_stderr": 0.03077805742293167, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.03077805742293167 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461766, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461766 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.024857478080250454, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.024857478080250454 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.013306478243066302, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.013306478243066302 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7485549132947977, "acc_stderr": 0.02335736578587403, "acc_norm": 0.7485549132947977, "acc_norm_stderr": 0.02335736578587403 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4301675977653631, "acc_stderr": 0.01655860163604104, "acc_norm": 0.4301675977653631, "acc_norm_stderr": 0.01655860163604104 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959607, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959607 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.5035460992907801, "acc_stderr": 0.02982674915328092, "acc_norm": 0.5035460992907801, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657473, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657473 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.019117213911495148, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.019117213911495148 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233264, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233264 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5458996328029376, "mc1_stderr": 0.017429593091323515, "mc2": 0.688217063142182, "mc2_stderr": 0.01518842298057346 }, "harness|winogrande|5": { "acc": 0.8255722178374112, "acc_stderr": 0.010665187902498435 }, "harness|gsm8k|5": { "acc": 0.689158453373768, "acc_stderr": 0.012748860507777725 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
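As a usage sketch complementing the loading instructions earlier in this card (it assumes the `datasets` library is installed and the Hugging Face Hub is reachable; the repository id and configuration names below are taken from this card's own metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1"

# Enumerate the per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the aggregated metrics; the "latest" split always points to the most recent run.
results = load_dataset(repo, "results", split="latest")

# Load the per-sample details for a single task, e.g. 5-shot GSM8K.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
```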
open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1
[ "region:us" ]
2024-01-21T04:47:38+00:00
{"pretty_name": "Evaluation run of 222gate/Blur-4x7b-MOE-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/Blur-4x7b-MOE-v0.1](https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:45:19.432784](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__Blur-4x7b-MOE-v0.1/blob/main/results_2024-01-21T04-45-19.432784.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559352269047928,\n \"acc_stderr\": 0.03199697744913013,\n \"acc_norm\": 0.6556619411144256,\n \"acc_norm_stderr\": 0.03265823717914628,\n \"mc1\": 0.5458996328029376,\n \"mc1_stderr\": 0.017429593091323515,\n \"mc2\": 0.688217063142182,\n \"mc2_stderr\": 0.01518842298057346\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068749,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7159928301135232,\n \"acc_stderr\": 0.004500186424443795,\n \"acc_norm\": 0.8813981278629756,\n \"acc_norm_stderr\": 0.003226586783421294\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n 
\"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5458996328029376,\n \"mc1_stderr\": 0.017429593091323515,\n \"mc2\": 0.688217063142182,\n \"mc2_stderr\": 0.01518842298057346\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498435\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \"acc_stderr\": 0.012748860507777725\n }\n}\n```", "repo_url": 
"https://huggingface.co/222gate/Blur-4x7b-MOE-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_45_19.432784", "path": ["**/details_harness|winogrande|5_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-45-19.432784.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T04_45_19.432784", "path": ["results_2024-01-21T04-45-19.432784.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T04-45-19.432784.parquet"]}]}]}
2024-01-21T04:48:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 222gate/Blur-4x7b-MOE-v0.1 Dataset automatically created during the evaluation run of model 222gate/Blur-4x7b-MOE-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:45:19.432784 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of 222gate/Blur-4x7b-MOE-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blur-4x7b-MOE-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:45:19.432784(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 222gate/Blur-4x7b-MOE-v0.1\n\n\n\nDataset automatically created during the evaluation run of model 222gate/Blur-4x7b-MOE-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:45:19.432784(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0ea34a6dcbe84784f83aec0ff8a732c7e0cd2f48
# Dataset Card for Evaluation run of NeuralNovel/Ember-7B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [NeuralNovel/Ember-7B-v0.1](https://huggingface.co/NeuralNovel/Ember-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T04:54:30.326660](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1/blob/main/results_2024-01-21T04-54-30.326660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6350177574762061, "acc_stderr": 0.03209396150318776, "acc_norm": 0.6453628417891409, "acc_norm_stderr": 0.032876548477357756, "mc1": 0.46511627906976744, "mc1_stderr": 0.017460849975873965, "mc2": 0.6328773143242248, "mc2_stderr": 0.015426240628860234 }, "harness|arc:challenge|25": { "acc": 0.6569965870307167, "acc_stderr": 0.01387242322371817, "acc_norm": 0.6843003412969283, "acc_norm_stderr": 0.013582571095815293 }, "harness|hellaswag|10": { "acc": 0.6719776936865166, "acc_stderr": 0.004685334844038663, "acc_norm": 0.8552081258713403, "acc_norm_stderr": 0.003511717085451996 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998905, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998905 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7283018867924528, "acc_stderr": 0.027377706624670713, "acc_norm": 0.7283018867924528, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778398, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181012, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181012 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121427, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121427 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652457, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652457 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059278, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059278 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.02704462171947409, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.02704462171947409 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.043546310772605956, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.043546310772605956 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092368, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092368 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39888268156424583, "acc_stderr": 0.016376966142610076, "acc_norm": 0.39888268156424583, "acc_norm_stderr": 0.016376966142610076 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7363344051446945, "acc_stderr": 0.02502553850053234, "acc_norm": 0.7363344051446945, "acc_norm_stderr": 0.02502553850053234 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.012740853872949834, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.012740853872949834 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6552287581699346, "acc_stderr": 0.019228322018696644, "acc_norm": 0.6552287581699346, "acc_norm_stderr": 0.019228322018696644 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.027979823538744546, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.027979823538744546 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.46511627906976744, "mc1_stderr": 0.017460849975873965, "mc2": 0.6328773143242248, "mc2_stderr": 0.015426240628860234 }, "harness|winogrande|5": { "acc": 0.8232044198895028, "acc_stderr": 0.010721923287918747 }, "harness|gsm8k|5": { "acc": 0.04700530705079606, "acc_stderr": 0.0058298983559371955 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
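As a usage sketch for the configurations described above, the snippet below loads the aggregated "results" configuration and one per-task configuration from this dataset. It only assumes the configuration and split names listed in this card's metadata ("results", "harness_gsm8k_5", "latest", and the timestamped split "2024_01_21T04_54_30.326660"); the exact column layout of the underlying parquet files is not documented in this card, so the final inspection lines are illustrative rather than definitive.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1"

# Aggregated metrics for the run: the "results" configuration.
# The "latest" split points to the same data as the timestamped split
# "2024_01_21T04_54_30.326660" for this single-run dataset.
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for a single task, e.g. the 5-shot GSM8K harness.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

# Illustrative only: inspect the first row of each configuration.
print(results[0])
print(gsm8k_details[0])
```

The same pattern applies to any of the 63 configurations listed under `configs` in the metadata below, for example `harness_hendrycksTest_abstract_algebra_5` or `harness_truthfulqa_mc_0`.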
open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1
[ "region:us" ]
2024-01-21T04:56:45+00:00
{"pretty_name": "Evaluation run of NeuralNovel/Ember-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Ember-7B-v0.1](https://huggingface.co/NeuralNovel/Ember-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T04:54:30.326660](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1/blob/main/results_2024-01-21T04-54-30.326660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350177574762061,\n \"acc_stderr\": 0.03209396150318776,\n \"acc_norm\": 0.6453628417891409,\n \"acc_norm_stderr\": 0.032876548477357756,\n \"mc1\": 0.46511627906976744,\n \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6328773143242248,\n \"mc2_stderr\": 0.015426240628860234\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.01387242322371817,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815293\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n \"acc_stderr\": 0.004685334844038663,\n \"acc_norm\": 0.8552081258713403,\n \"acc_norm_stderr\": 0.003511717085451996\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 
0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610076,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6328773143242248,\n \"mc2_stderr\": 0.015426240628860234\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \"acc_stderr\": 0.0058298983559371955\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeuralNovel/Ember-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T04_54_30.326660", "path": ["**/details_harness|winogrande|5_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T04-54-30.326660.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T04_54_30.326660", "path": ["results_2024-01-21T04-54-30.326660.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T04-54-30.326660.parquet"]}]}]}
2024-01-21T04:57:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NeuralNovel/Ember-7B-v0.1 Dataset automatically created during the evaluation run of model NeuralNovel/Ember-7B-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T04:54:30.326660 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
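The loading snippet referenced just above was dropped when this card's text was flattened; here is a minimal sketch, assuming the repository id follows the usual "open-llm-leaderboard/details_<org>__<model>" naming pattern used by the other cards in this dump, and using "harness_winogrande_5" purely as an illustrative configuration:

```python
from datasets import load_dataset

# Repository id inferred from the model name (assumption: it follows the
# "open-llm-leaderboard/details_<org>__<model>" pattern of the other cards).
data = load_dataset(
    "open-llm-leaderboard/details_NeuralNovel__Ember-7B-v0.1",
    "harness_winogrande_5",  # any config listed in this card's metadata works
    split="train",           # "train" always points to the latest results
)
```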
[ "# Dataset Card for Evaluation run of NeuralNovel/Ember-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Ember-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:54:30.326660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NeuralNovel/Ember-7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Ember-7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T04:54:30.326660(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0ea2ca375fb64f395d4b2a897ddd5eb58c186c12
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:33:56.046720](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1/blob/main/results_2024-01-21T05-33-56.046720.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5886798841868525, "acc_stderr": 0.033481177434210675, "acc_norm": 0.5934525981229489, "acc_norm_stderr": 0.034167981418467, "mc1": 0.4700122399020808, "mc1_stderr": 0.017471992091697537, "mc2": 0.6312877411374193, "mc2_stderr": 0.015524870393458118 }, "harness|arc:challenge|25": { "acc": 0.5435153583617748, "acc_stderr": 0.014555949760496444, "acc_norm": 0.6015358361774744, "acc_norm_stderr": 0.014306946052735567 }, "harness|hellaswag|10": { "acc": 0.6310495917147978, "acc_stderr": 0.004815343349305216, "acc_norm": 0.8259310894244174, "acc_norm_stderr": 0.0037839381501516165 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5333333333333333, "acc_stderr": 0.043097329010363554, "acc_norm": 0.5333333333333333, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5921052631578947, "acc_stderr": 0.039993097127774734, "acc_norm": 0.5921052631578947, "acc_norm_stderr": 0.039993097127774734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6415094339622641, "acc_stderr": 0.029514703583981765, "acc_norm": 0.6415094339622641, "acc_norm_stderr": 0.029514703583981765 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6736111111111112, "acc_stderr": 0.03921067198982266, "acc_norm": 0.6736111111111112, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { 
"acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5664739884393064, "acc_stderr": 0.037786210790920566, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.037786210790920566 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105654, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.502127659574468, "acc_stderr": 0.03268572658667492, "acc_norm": 0.502127659574468, "acc_norm_stderr": 0.03268572658667492 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36243386243386244, "acc_stderr": 0.02475747390275206, "acc_norm": 0.36243386243386244, "acc_norm_stderr": 0.02475747390275206 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6580645161290323, "acc_stderr": 0.026985289576552735, "acc_norm": 0.6580645161290323, "acc_norm_stderr": 0.026985289576552735 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5369458128078818, "acc_stderr": 0.035083705204426656, "acc_norm": 0.5369458128078818, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6909090909090909, "acc_stderr": 0.036085410115739666, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.036085410115739666 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7474747474747475, "acc_stderr": 0.030954055470365897, "acc_norm": 0.7474747474747475, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153303, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153303 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5512820512820513, "acc_stderr": 0.025217315184846482, "acc_norm": 0.5512820512820513, "acc_norm_stderr": 0.025217315184846482 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5966386554621849, "acc_stderr": 0.031866081214088314, "acc_norm": 0.5966386554621849, "acc_norm_stderr": 
0.031866081214088314 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7779816513761468, "acc_stderr": 0.017818849564796648, "acc_norm": 0.7779816513761468, "acc_norm_stderr": 0.017818849564796648 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.47685185185185186, "acc_stderr": 0.03406315360711507, "acc_norm": 0.47685185185185186, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467766, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467766 }, "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650743, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650743 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503948, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503948 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.023636873317489294, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.023636873317489294 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7662835249042146, "acc_stderr": 0.01513338327898883, "acc_norm": 0.7662835249042146, "acc_norm_stderr": 0.01513338327898883 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.025416003773165545, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.025416003773165545 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30726256983240224, "acc_stderr": 0.015430158846469613, "acc_norm": 0.30726256983240224, "acc_norm_stderr": 0.015430158846469613 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6764705882352942, "acc_stderr": 0.0267874531119065, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.0267874531119065 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6591639871382636, "acc_stderr": 0.026920841260776162, "acc_norm": 0.6591639871382636, "acc_norm_stderr": 0.026920841260776162 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6666666666666666, "acc_stderr": 0.02622964917882116, "acc_norm": 
0.6666666666666666, "acc_norm_stderr": 0.02622964917882116 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4335071707953064, "acc_stderr": 0.012656810383983965, "acc_norm": 0.4335071707953064, "acc_norm_stderr": 0.012656810383983965 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5772058823529411, "acc_stderr": 0.030008562845003476, "acc_norm": 0.5772058823529411, "acc_norm_stderr": 0.030008562845003476 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6078431372549019, "acc_stderr": 0.019751726508762637, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.019751726508762637 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.02904308868330433, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.02904308868330433 }, "harness|hendrycksTest-sociology|5": { "acc": 0.746268656716418, "acc_stderr": 0.030769444967296018, "acc_norm": 0.746268656716418, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866766, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.0312678171466318, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.4700122399020808, "mc1_stderr": 0.017471992091697537, "mc2": 0.6312877411374193, "mc2_stderr": 0.015524870393458118 }, "harness|winogrande|5": { "acc": 0.771112865035517, "acc_stderr": 0.011807360224025391 }, "harness|gsm8k|5": { "acc": 0.3775587566338135, "acc_stderr": 0.013353150666358539 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
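Because this dataset records two runs, each task configuration exposes one timestamped split per run in addition to "latest"; a short sketch using the split names that appear in the configs listed in this card's metadata (taking the GSM8K details as an example):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1"

# "latest" always resolves to the most recent run (2024-01-21T05:33:56 here).
gsm8k_latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")

# The earlier run can be addressed directly through its timestamped split name.
gsm8k_first_run = load_dataset(REPO, "harness_gsm8k_5", split="2024_01_21T05_27_51.994355")
```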
open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1
[ "region:us" ]
2024-01-21T05:30:14+00:00
{"pretty_name": "Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:33:56.046720](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1/blob/main/results_2024-01-21T05-33-56.046720.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5886798841868525,\n \"acc_stderr\": 0.033481177434210675,\n \"acc_norm\": 0.5934525981229489,\n \"acc_norm_stderr\": 0.034167981418467,\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6312877411374193,\n \"mc2_stderr\": 0.015524870393458118\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496444,\n \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735567\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6310495917147978,\n \"acc_stderr\": 0.004815343349305216,\n \"acc_norm\": 0.8259310894244174,\n \"acc_norm_stderr\": 0.0037839381501516165\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981765,\n \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981765\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 
0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552735,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n \"acc_norm\": 
0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796648,\n \"acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n \"acc_stderr\": 0.015430158846469613,\n \"acc_norm\": 0.30726256983240224,\n \"acc_norm_stderr\": 0.015430158846469613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697537,\n \"mc2\": 0.6312877411374193,\n \"mc2_stderr\": 0.015524870393458118\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025391\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3775587566338135,\n \"acc_stderr\": 0.013353150666358539\n }\n}\n```", "repo_url": "https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-27-51.994355.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-27-51.994355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": 
"2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-27-51.994355.parquet"]}, 
{"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["**/details_harness|winogrande|5_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": ["**/details_harness|winogrande|5_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-33-56.046720.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T05_27_51.994355", "path": ["results_2024-01-21T05-27-51.994355.parquet"]}, {"split": "2024_01_21T05_33_56.046720", "path": 
["results_2024-01-21T05-33-56.046720.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T05-33-56.046720.parquet"]}]}]}
2024-01-21T05:36:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 Dataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T05:33:56.046720 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
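The loading snippet referenced just above ("you can for instance do the following") is not reproduced in this plain-text rendering, so a minimal sketch follows. It assumes the details repository uses the leaderboard's usual `details_<org>__<model>` naming pattern (the exact repository id for this run is not stated in this record) and uses the `harness_winogrande_5` configuration and the `latest` split listed in the metadata above.

```python
from datasets import load_dataset

# Repository id is an assumption, inferred from the leaderboard's
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e1",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```

Swapping in the `results` configuration would load the aggregated results described above instead of the per-task details.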
[ "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:33:56.046720(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:33:56.046720(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c498d3d56a7c023f8928978363a30b98c0aadfe0
# Dataset Card for "speech_commands_extract_unit" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/speech_commands_extract_unit
[ "region:us" ]
2024-01-21T05:36:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 105709368, "num_examples": 64727}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 105709368, "num_examples": 64727}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 157026808, "num_examples": 64727}, {"name": "audiodec_24k_320d", "num_bytes": 332800616, "num_examples": 64727}, {"name": "dac_16k", "num_bytes": 315363192, "num_examples": 64727}, {"name": "dac_24k", "num_bytes": 1245776776, "num_examples": 64727}, {"name": "dac_44k", "num_bytes": 406863564, "num_examples": 64727}, {"name": "encodec_24k", "num_bytes": 80640400, "num_examples": 64727}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 846964616, "num_examples": 64727}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 846964616, "num_examples": 64727}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 846962568, "num_examples": 64727}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 436754312, "num_examples": 64727}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 846962568, "num_examples": 64727}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 436754312, "num_examples": 64727}, {"name": "speech_tokenizer_16k", "num_bytes": 213377832, "num_examples": 64727}], "download_size": 1145201889, "dataset_size": 7224630916}}
2024-01-21T05:39:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "speech_commands_extract_unit" More Information needed
[ "# Dataset Card for \"speech_commands_extract_unit\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"speech_commands_extract_unit\"\n\nMore Information needed" ]
a2d7f7d74286594e63b9b3a80e54c275455667bd
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:43:59.108748](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2/blob/main/results_2024-01-21T05-43-59.108748.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5937268860799231, "acc_stderr": 0.03339486970483276, "acc_norm": 0.5987481844006874, "acc_norm_stderr": 0.034078201677495076, "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6269642246460232, "mc2_stderr": 0.01559496631642023 }, "harness|arc:challenge|25": { "acc": 0.5392491467576792, "acc_stderr": 0.014566303676636583, "acc_norm": 0.5947098976109215, "acc_norm_stderr": 0.014346869060229311 }, "harness|hellaswag|10": { "acc": 0.6337382991435969, "acc_stderr": 0.0048079755154464875, "acc_norm": 0.8272256522605059, "acc_norm_stderr": 0.003772794447185149 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5921052631578947, "acc_stderr": 0.039993097127774734, "acc_norm": 0.5921052631578947, "acc_norm_stderr": 0.039993097127774734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6339622641509434, "acc_stderr": 0.029647813539365242, "acc_norm": 0.6339622641509434, "acc_norm_stderr": 0.029647813539365242 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 
0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5491329479768786, "acc_stderr": 0.0379401267469703, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.0379401267469703 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.04082482904638629, "acc_norm": 0.6, "acc_norm_stderr": 0.04082482904638629 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35978835978835977, "acc_stderr": 0.02471807594412928, "acc_norm": 0.35978835978835977, "acc_norm_stderr": 0.02471807594412928 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939098, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091707, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091707 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790486, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790486 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397433, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5615384615384615, "acc_stderr": 0.02515826601686858, "acc_norm": 0.5615384615384615, "acc_norm_stderr": 0.02515826601686858 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5798319327731093, "acc_stderr": 0.03206183783236152, "acc_norm": 0.5798319327731093, "acc_norm_stderr": 0.03206183783236152 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7908256880733945, "acc_stderr": 0.017437937173343233, "acc_norm": 0.7908256880733945, "acc_norm_stderr": 0.017437937173343233 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854052, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604246, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.02782078198114969, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.02782078198114969 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.042365112580946336, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7055214723926381, "acc_stderr": 0.03581165790474082, "acc_norm": 0.7055214723926381, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.768837803320562, "acc_stderr": 0.015075523238101081, "acc_norm": 0.768837803320562, "acc_norm_stderr": 0.015075523238101081 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6473988439306358, "acc_stderr": 0.025722802200895817, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.025722802200895817 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3229050279329609, "acc_stderr": 0.015638440380241488, "acc_norm": 0.3229050279329609, "acc_norm_stderr": 0.015638440380241488 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6503267973856209, "acc_stderr": 0.027305308076274702, "acc_norm": 0.6503267973856209, "acc_norm_stderr": 0.027305308076274702 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6495176848874598, "acc_stderr": 0.02709865262130175, "acc_norm": 0.6495176848874598, "acc_norm_stderr": 0.02709865262130175 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.02604176620271716, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 
0.02604176620271716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42698826597131684, "acc_stderr": 0.012633353557534425, "acc_norm": 0.42698826597131684, "acc_norm_stderr": 0.012633353557534425 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5625, "acc_stderr": 0.030134614954403924, "acc_norm": 0.5625, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6045751633986928, "acc_stderr": 0.01978046595477751, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.01978046595477751 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6269642246460232, "mc2_stderr": 0.01559496631642023 }, "harness|winogrande|5": { "acc": 0.7663772691397001, "acc_stderr": 0.011892194477183525 }, "harness|gsm8k|5": { "acc": 0.3737680060652009, "acc_stderr": 0.013326342860737021 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2
[ "region:us" ]
2024-01-21T05:37:17+00:00
{"pretty_name": "Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2", "dataset_summary": "Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:43:59.108748](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2/blob/main/results_2024-01-21T05-43-59.108748.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5937268860799231,\n \"acc_stderr\": 0.03339486970483276,\n \"acc_norm\": 0.5987481844006874,\n \"acc_norm_stderr\": 0.034078201677495076,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6269642246460232,\n \"mc2_stderr\": 0.01559496631642023\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636583,\n \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229311\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6337382991435969,\n \"acc_stderr\": 0.0048079755154464875,\n \"acc_norm\": 0.8272256522605059,\n \"acc_norm_stderr\": 0.003772794447185149\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35978835978835977,\n \"acc_stderr\": 0.02471807594412928,\n \"acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.02471807594412928\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397433,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397433\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.768837803320562,\n \"acc_stderr\": 0.015075523238101081,\n \"acc_norm\": 0.768837803320562,\n \"acc_norm_stderr\": 0.015075523238101081\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n \"acc_stderr\": 0.015638440380241488,\n \"acc_norm\": 0.3229050279329609,\n \"acc_norm_stderr\": 0.015638440380241488\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274702,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274702\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.01978046595477751,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.01978046595477751\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6269642246460232,\n \"mc2_stderr\": 0.01559496631642023\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3737680060652009,\n \"acc_stderr\": 0.013326342860737021\n }\n}\n```", "repo_url": 
"https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-34-58.151174.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-34-58.151174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-43-59.108748.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-43-59.108748.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-43-59.108748.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-43-59.108748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-34-58.151174.parquet"]}, 
{"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["**/details_harness|winogrande|5_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": ["**/details_harness|winogrande|5_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-43-59.108748.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T05_34_58.151174", "path": ["results_2024-01-21T05-34-58.151174.parquet"]}, {"split": "2024_01_21T05_43_59.108748", "path": 
["results_2024-01-21T05-43-59.108748.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T05-43-59.108748.parquet"]}]}]}
2024-01-21T05:46:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 Dataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T05:43:59.108748 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
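The load call referenced above was stripped from this plain-text field; a minimal sketch is given below, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this collection:

```python
from datasets import load_dataset

# Minimal sketch: load the per-sample details for one sub-task of this evaluation run.
# The repository name is assumed from the standard leaderboard naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e2",
    "harness_winogrande_5",
    split="train",
)
```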
[ "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:43:59.108748(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:43:59.108748(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
d8f587b90505c12243316291b0ff332a08f0cd69
# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vicgalle/NeuralBeagle-11B](https://huggingface.co/vicgalle/NeuralBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:36:08.681056](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B/blob/main/results_2024-01-21T05-36-08.681056.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6427615979936613, "acc_stderr": 0.032364857864304215, "acc_norm": 0.6436873631017901, "acc_norm_stderr": 0.03302566377855622, "mc1": 0.587515299877601, "mc1_stderr": 0.017233299399571207, "mc2": 0.7135848334628236, "mc2_stderr": 0.015017949998169619 }, "harness|arc:challenge|25": { "acc": 0.7107508532423208, "acc_stderr": 0.013250012579393443, "acc_norm": 0.7329351535836177, "acc_norm_stderr": 0.012928933196496357 }, "harness|hellaswag|10": { "acc": 0.7130053774148576, "acc_stderr": 0.004514345547780333, "acc_norm": 0.8761202947619996, "acc_norm_stderr": 0.0032877097411288 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.038424985593952694, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.038424985593952694 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569526, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569526 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082636, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.02533120243894443, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.02533120243894443 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.024035489676335082, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.024035489676335082 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.02616056824660146, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.02616056824660146 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.0306365913486998, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.0306365913486998 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.038498560987940904, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.038498560987940904 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.01346820161406629, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.01346820161406629 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.02454761779480383, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.02454761779480383 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4402234636871508, "acc_stderr": 0.01660256461504994, "acc_norm": 0.4402234636871508, "acc_norm_stderr": 0.01660256461504994 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.02609016250427905, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.02600330111788514, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.02600330111788514 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7160493827160493, "acc_stderr": 0.025089478523765137, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 0.025089478523765137 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.02904308868330434, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.02904308868330434 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233268, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233268 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.587515299877601, "mc1_stderr": 0.017233299399571207, "mc2": 0.7135848334628236, "mc2_stderr": 0.015017949998169619 }, "harness|winogrande|5": { "acc": 0.8263614838200474, "acc_stderr": 0.010646116480330994 }, "harness|gsm8k|5": { "acc": 0.5898407884761183, "acc_stderr": 0.013548335117860341 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
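The card above shows how to pull a single task's details; the aggregated scores live in the separate "results" configuration it mentions. A minimal sketch of loading them, assuming the config and split names listed in this card's metadata:

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B"

# Aggregated metrics for every run live in the "results" configuration;
# the "latest" split points at the most recent evaluation.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```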
open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B
[ "region:us" ]
2024-01-21T05:38:27+00:00
{"pretty_name": "Evaluation run of vicgalle/NeuralBeagle-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/NeuralBeagle-11B](https://huggingface.co/vicgalle/NeuralBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:36:08.681056](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B/blob/main/results_2024-01-21T05-36-08.681056.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427615979936613,\n \"acc_stderr\": 0.032364857864304215,\n \"acc_norm\": 0.6436873631017901,\n \"acc_norm_stderr\": 0.03302566377855622,\n \"mc1\": 0.587515299877601,\n \"mc1_stderr\": 0.017233299399571207,\n \"mc2\": 0.7135848334628236,\n \"mc2_stderr\": 0.015017949998169619\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393443,\n \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496357\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7130053774148576,\n \"acc_stderr\": 0.004514345547780333,\n \"acc_norm\": 0.8761202947619996,\n \"acc_norm_stderr\": 0.0032877097411288\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.01346820161406629,\n \"acc_norm\": 0.8288633461047255,\n 
\"acc_norm_stderr\": 0.01346820161406629\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.02454761779480383,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.02454761779480383\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330434,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330434\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.587515299877601,\n \"mc1_stderr\": 0.017233299399571207,\n \"mc2\": 0.7135848334628236,\n \"mc2_stderr\": 0.015017949998169619\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330994\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5898407884761183,\n \"acc_stderr\": 0.013548335117860341\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/NeuralBeagle-11B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-36-08.681056.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-36-08.681056.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-36-08.681056.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-36-08.681056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-36-08.681056.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-36-08.681056.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["**/details_harness|winogrande|5_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-36-08.681056.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T05_36_08.681056", "path": ["results_2024-01-21T05-36-08.681056.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T05-36-08.681056.parquet"]}]}]}
2024-01-21T05:39:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B Dataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T05:36:08.681056 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
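The card text above stops at "do the following:" because the code block was stripped from this flattened copy. A minimal sketch of what that load would look like, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact repo id for this run is not shown in this excerpt) and using the `harness_winogrande_5` config and `latest` split listed in this record's metadata:

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the leaderboard's naming convention
# for vicgalle/NeuralBeagle-11B; verify against the record's id field.
data = load_dataset(
    "open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B",
    "harness_winogrande_5",  # per-task config declared in this record's metadata
    split="latest",          # "latest" points to the most recent run's parquet
)
```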
[ "# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:36:08.681056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:36:08.681056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
87b78f8b35b721d5bb6a41bf7123263e55fc7426
This dataset was generated by reformatting [`coref-data/gum_raw`](https://huggingface.co/datasets/coref-data/gum_raw) into the indiscrim coreference format. See that repo for dataset details. See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
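As a usage sketch (not part of the original card), the reformatted data can be loaded with the standard `datasets` API, using the config names (`original`, `ontogum`) and splits (`train`, `validation`, `test`) declared in this record's metadata:

```python
from datasets import load_dataset

# Config names and splits come from this record's dataset_info metadata.
gum = load_dataset("coref-data/gum_indiscrim", "original", split="train")

doc = gum[0]
# Each document carries sentences (with token-level dependency fields) plus
# coref_chains as nested lists of integer spans, per the indiscrim format.
print(doc["id"], len(doc["sentences"]), len(doc["coref_chains"]))
```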
coref-data/gum_indiscrim
[ "region:us" ]
2024-01-21T05:39:15+00:00
{"dataset_info": [{"config_name": "ontogum", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "deps", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "int64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 23472505, "num_examples": 165}, {"name": "validation", "num_bytes": 3119527, "num_examples": 24}, {"name": "test", "num_bytes": 3180699, "num_examples": 24}], "download_size": 7424694, "dataset_size": 29772731}, {"config_name": "original", "features": [{"name": "sentences", "list": [{"name": "id", "dtype": "int64"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "tokens", "list": [{"name": "deprel", "dtype": "string"}, {"name": "feats", "dtype": "string"}, {"name": "head", "dtype": "int64"}, {"name": "id", "dtype": "float64"}, {"name": "lemma", "dtype": "string"}, {"name": "misc", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "upos", "dtype": "string"}, {"name": "xpos", "dtype": "string"}]}, {"name": "misc", "struct": [{"name": "parse_tree", "dtype": "string"}]}]}, {"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "coref_chains", "sequence": {"sequence": {"sequence": "int64"}}}, {"name": "genre", "dtype": "string"}, {"name": "meta_data", "struct": [{"name": "comment", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 22369183, "num_examples": 165}, {"name": "validation", "num_bytes": 2970347, "num_examples": 24}, {"name": "test", "num_bytes": 3038551, "num_examples": 24}], "download_size": 7048887, "dataset_size": 28378081}], "configs": [{"config_name": "ontogum", "data_files": [{"split": "train", "path": "ontogum/train-*"}, {"split": "validation", "path": "ontogum/validation-*"}, {"split": "test", "path": "ontogum/test-*"}]}, {"config_name": "original", "data_files": [{"split": "train", "path": "original/train-*"}, {"split": "validation", "path": "original/validation-*"}, {"split": "test", "path": "original/test-*"}]}]}
2024-02-13T03:46:14+00:00
[]
[]
TAGS #region-us
This dataset was generated by reformatting 'coref-data/gum_raw' into the indiscrim coreference format. See that repo for dataset details. See ianporada/coref-data for additional conversion details and the conversion script. Please create an issue in the repo above or in this dataset repo for any questions.
[]
[ "TAGS\n#region-us \n" ]
80c239e349c81c276dd1c35e2232d3f3d72310c1
# Dataset Card for Evaluation run of Locutusque/UltraQwen-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Locutusque/UltraQwen-7B](https://huggingface.co/Locutusque/UltraQwen-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Locutusque__UltraQwen-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:39:45.289395](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__UltraQwen-7B/blob/main/results_2024-01-21T05-39-45.289395.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5898728989909359, "acc_stderr": 0.033120537181371056, "acc_norm": 0.5935424485170769, "acc_norm_stderr": 0.033788814320385836, "mc1": 0.33414932680538556, "mc1_stderr": 0.016512530677150528, "mc2": 0.4820049087562541, "mc2_stderr": 0.015078488124942854 }, "harness|arc:challenge|25": { "acc": 0.5051194539249146, "acc_stderr": 0.014610624890309157, "acc_norm": 0.5170648464163823, "acc_norm_stderr": 0.014602878388536598 }, "harness|hellaswag|10": { "acc": 0.5744871539533958, "acc_stderr": 0.004934100774481224, "acc_norm": 0.7793268273252341, "acc_norm_stderr": 0.004138529919075828 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.038424985593952694, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.038424985593952694 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796005, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796005 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404947, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.03268335899936338, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.03268335899936338 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.043391383225798615, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.043391383225798615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.455026455026455, "acc_stderr": 0.025646928361049398, "acc_norm": 0.455026455026455, "acc_norm_stderr": 0.025646928361049398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7096774193548387, "acc_stderr": 0.025822106119415895, "acc_norm": 0.7096774193548387, "acc_norm_stderr": 0.025822106119415895 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7090909090909091, "acc_stderr": 0.03546563019624336, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.029857515673386417, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.029857515673386417 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153317, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153317 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5794871794871795, "acc_stderr": 0.025028610276710855, "acc_norm": 0.5794871794871795, "acc_norm_stderr": 0.025028610276710855 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5882352941176471, "acc_stderr": 0.03196876989195778, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.03196876989195778 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242741, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242741 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7834862385321101, "acc_stderr": 0.017658710594443128, "acc_norm": 0.7834862385321101, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.03372343271653064, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.03372343271653064 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416828, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416828 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467765, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467765 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6809815950920245, "acc_stderr": 0.03661997551073836, "acc_norm": 0.6809815950920245, "acc_norm_stderr": 0.03661997551073836 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7713920817369093, "acc_stderr": 0.015016884698539883, "acc_norm": 0.7713920817369093, "acc_norm_stderr": 0.015016884698539883 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6560693641618497, "acc_stderr": 0.025574123786546655, "acc_norm": 0.6560693641618497, "acc_norm_stderr": 0.025574123786546655 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3039106145251397, "acc_stderr": 0.015382845587584525, "acc_norm": 0.3039106145251397, "acc_norm_stderr": 0.015382845587584525 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6601307189542484, "acc_stderr": 0.027121956071388852, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.027121956071388852 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6655948553054662, "acc_stderr": 0.026795422327893934, "acc_norm": 0.6655948553054662, "acc_norm_stderr": 0.026795422327893934 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6666666666666666, "acc_stderr": 0.02622964917882116, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.02622964917882116 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.02952591430255856, "acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.02952591430255856 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42503259452411996, "acc_stderr": 0.012625879884892003, "acc_norm": 0.42503259452411996, "acc_norm_stderr": 0.012625879884892003 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5441176470588235, "acc_stderr": 0.030254372573976722, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.030254372573976722 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5882352941176471, "acc_stderr": 0.019910377463105932, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.019910377463105932 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.04653429807913507, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.04653429807913507 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291286, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291286 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866766, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7309941520467836, "acc_stderr": 0.03401052620104089, "acc_norm": 0.7309941520467836, "acc_norm_stderr": 0.03401052620104089 }, "harness|truthfulqa:mc|0": { "mc1": 0.33414932680538556, "mc1_stderr": 0.016512530677150528, "mc2": 0.4820049087562541, "mc2_stderr": 0.015078488124942854 }, "harness|winogrande|5": { "acc": 0.739542225730071, "acc_stderr": 0.01233483367199829 }, "harness|gsm8k|5": { "acc": 0.4404852160727824, "acc_stderr": 0.013674572131693888 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
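Beyond the per-task config shown in the card's own snippet, the prose above also mentions an aggregated "results" configuration. A hedged sketch for pulling those aggregated metrics, assuming this repo exposes the same "results" config and "latest" split layout as the other evaluation-run records in this dump:

```python
from datasets import load_dataset

# The "results" config and "latest" split are assumed from the layout of the
# other open-llm-leaderboard detail repos shown here; adjust if they differ.
results = load_dataset(
    "open-llm-leaderboard/details_Locutusque__UltraQwen-7B",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracy / stderr figures for the run
```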
open-llm-leaderboard/details_Locutusque__UltraQwen-7B
[ "region:us" ]
2024-01-21T05:41:50+00:00
{"pretty_name": "Evaluation run of Locutusque/UltraQwen-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/UltraQwen-7B](https://huggingface.co/Locutusque/UltraQwen-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__UltraQwen-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:39:45.289395](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__UltraQwen-7B/blob/main/results_2024-01-21T05-39-45.289395.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5898728989909359,\n \"acc_stderr\": 0.033120537181371056,\n \"acc_norm\": 0.5935424485170769,\n \"acc_norm_stderr\": 0.033788814320385836,\n \"mc1\": 0.33414932680538556,\n \"mc1_stderr\": 0.016512530677150528,\n \"mc2\": 0.4820049087562541,\n \"mc2_stderr\": 0.015078488124942854\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536598\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5744871539533958,\n \"acc_stderr\": 0.004934100774481224,\n \"acc_norm\": 0.7793268273252341,\n \"acc_norm_stderr\": 0.004138529919075828\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936338,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936338\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153317,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153317\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7713920817369093,\n \"acc_stderr\": 0.015016884698539883,\n \"acc_norm\": 0.7713920817369093,\n \"acc_norm_stderr\": 0.015016884698539883\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546655,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546655\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584525,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584525\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n \"acc_stderr\": 0.012625879884892003,\n \"acc_norm\": 0.42503259452411996,\n \"acc_norm_stderr\": 0.012625879884892003\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976722,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976722\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n \"mc1_stderr\": 0.016512530677150528,\n \"mc2\": 0.4820049087562541,\n \"mc2_stderr\": 0.015078488124942854\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.01233483367199829\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4404852160727824,\n \"acc_stderr\": 
0.013674572131693888\n }\n}\n```", "repo_url": "https://huggingface.co/Locutusque/UltraQwen-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-39-45.289395.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-39-45.289395.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-39-45.289395.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-39-45.289395.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-39-45.289395.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_39_45.289395", "path": ["**/details_harness|winogrande|5_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-39-45.289395.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T05_39_45.289395", "path": ["results_2024-01-21T05-39-45.289395.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T05-39-45.289395.parquet"]}]}]}
2024-01-21T05:42:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Locutusque/UltraQwen-7B Dataset automatically created during the evaluation run of model Locutusque/UltraQwen-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card text): ## Latest results These are the latest results from run 2024-01-21T05:39:45.289395 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
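The flattened card above announces a loading snippet that was dropped during flattening. A minimal sketch is given below, again assuming the details_Locutusque__UltraQwen-7B repo id inferred from the card title; any config_name from the metadata above can be substituted:

```python
from datasets import load_dataset

# The card's prose mentions a "train" split, but the splits declared in the
# metadata above are the timestamped run and "latest", so "latest" is used here.
data = load_dataset(
    "open-llm-leaderboard/details_Locutusque__UltraQwen-7B",
    "harness_winogrande_5",
    split="latest",
)
```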
[ "# Dataset Card for Evaluation run of Locutusque/UltraQwen-7B\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/UltraQwen-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:39:45.289395(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Locutusque/UltraQwen-7B\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/UltraQwen-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:39:45.289395(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
35b9df52b499aa3747193fe2a1757611d3193ee3
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:55:44.790706](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3/blob/main/results_2024-01-21T05-55-44.790706.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5935864640982287, "acc_stderr": 0.03347215046465289, "acc_norm": 0.5988494527128462, "acc_norm_stderr": 0.034156196691824244, "mc1": 0.4675642594859241, "mc1_stderr": 0.017466632149577613, "mc2": 0.6299916888512991, "mc2_stderr": 0.015554627676562658 }, "harness|arc:challenge|25": { "acc": 0.5332764505119454, "acc_stderr": 0.014578995859605806, "acc_norm": 0.5998293515358362, "acc_norm_stderr": 0.014317197787809169 }, "harness|hellaswag|10": { "acc": 0.6346345349531965, "acc_stderr": 0.004805483767055348, "acc_norm": 0.8276239792869946, "acc_norm_stderr": 0.003769350079195883 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.04276349494376599, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.039889037033362836, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.0295822451283843, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.0295822451283843 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6458333333333334, "acc_stderr": 0.039994111357535424, "acc_norm": 0.6458333333333334, "acc_norm_stderr": 0.039994111357535424 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283648, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283648 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.046151869625837026, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.046151869625837026 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35714285714285715, "acc_stderr": 0.024677862841332783, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.024677862841332783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.03510766597959217, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.03510766597959217 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6909090909090909, "acc_stderr": 0.036085410115739666, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.036085410115739666 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.029857515673386417, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.029857515673386417 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153303, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153303 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5564102564102564, "acc_stderr": 0.025189149894764205, "acc_norm": 0.5564102564102564, "acc_norm_stderr": 0.025189149894764205 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02874204090394849, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02874204090394849 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6008403361344538, "acc_stderr": 0.03181110032413926, "acc_norm": 0.6008403361344538, "acc_norm_stderr": 0.03181110032413926 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7889908256880734, "acc_stderr": 0.01749392240411265, "acc_norm": 0.7889908256880734, "acc_norm_stderr": 0.01749392240411265 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854052, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7401960784313726, "acc_stderr": 0.030778554678693268, "acc_norm": 0.7401960784313726, "acc_norm_stderr": 0.030778554678693268 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.032596251184168264, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.032596251184168264 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.04039314978724561, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.04039314978724561 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7107438016528925, "acc_stderr": 0.041391127276354626, "acc_norm": 0.7107438016528925, "acc_norm_stderr": 0.041391127276354626 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664743, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664743 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.023636873317489294, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.023636873317489294 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905713, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905713 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.653179190751445, "acc_stderr": 0.025624723994030454, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.025624723994030454 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33519553072625696, "acc_stderr": 0.015788007190185884, "acc_norm": 0.33519553072625696, "acc_norm_stderr": 0.015788007190185884 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6601307189542484, "acc_stderr": 0.027121956071388852, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.027121956071388852 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6591639871382636, "acc_stderr": 0.026920841260776162, "acc_norm": 0.6591639871382636, "acc_norm_stderr": 0.026920841260776162 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6851851851851852, "acc_stderr": 0.02584224870090217, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 
0.02584224870090217 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41916558018252936, "acc_stderr": 0.012602244505788236, "acc_norm": 0.41916558018252936, "acc_norm_stderr": 0.012602244505788236 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5514705882352942, "acc_stderr": 0.030211479609121593, "acc_norm": 0.5514705882352942, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5980392156862745, "acc_stderr": 0.019835176484375393, "acc_norm": 0.5980392156862745, "acc_norm_stderr": 0.019835176484375393 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7761194029850746, "acc_stderr": 0.029475250236017204, "acc_norm": 0.7761194029850746, "acc_norm_stderr": 0.029475250236017204 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.4675642594859241, "mc1_stderr": 0.017466632149577613, "mc2": 0.6299916888512991, "mc2_stderr": 0.015554627676562658 }, "harness|winogrande|5": { "acc": 0.7624309392265194, "acc_stderr": 0.011961298905803143 }, "harness|gsm8k|5": { "acc": 0.3737680060652009, "acc_stderr": 0.013326342860737026 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3
[ "region:us" ]
2024-01-21T05:49:17+00:00
{"pretty_name": "Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3", "dataset_summary": "Dataset automatically created during the evaluation run of model [pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3](https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:55:44.790706](https://huggingface.co/datasets/open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3/blob/main/results_2024-01-21T05-55-44.790706.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5935864640982287,\n \"acc_stderr\": 0.03347215046465289,\n \"acc_norm\": 0.5988494527128462,\n \"acc_norm_stderr\": 0.034156196691824244,\n \"mc1\": 0.4675642594859241,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6299916888512991,\n \"mc2_stderr\": 0.015554627676562658\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5332764505119454,\n \"acc_stderr\": 0.014578995859605806,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6346345349531965,\n \"acc_stderr\": 0.004805483767055348,\n \"acc_norm\": 0.8276239792869946,\n \"acc_norm_stderr\": 0.003769350079195883\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 
0.026148483469153303\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693268,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693268\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.032596251184168264,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.032596251184168264\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.7675606641123882,\n \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.019835176484375393,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.019835176484375393\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4675642594859241,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6299916888512991,\n \"mc2_stderr\": 0.015554627676562658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803143\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.3737680060652009,\n \"acc_stderr\": 0.013326342860737026\n }\n}\n```", "repo_url": "https://huggingface.co/pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-47-00.811224.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-47-00.811224.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-55-44.790706.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-55-44.790706.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-55-44.790706.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-55-44.790706.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-55-44.790706.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": 
"2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-47-00.811224.parquet"]}, 
{"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["**/details_harness|winogrande|5_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": ["**/details_harness|winogrande|5_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-55-44.790706.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T05_47_00.811224", "path": ["results_2024-01-21T05-47-00.811224.parquet"]}, {"split": "2024_01_21T05_55_44.790706", "path": 
["results_2024-01-21T05-55-44.790706.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T05-55-44.790706.parquet"]}]}]}
2024-01-21T05:58:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 Dataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T05:55:44.790706 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
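The flattened card text above refers to a loading snippet ("you can for instance do the following") that is not present in this flattened field. A sketch of the usual form follows, with the same caveat that the repository id is assumed from the leaderboard's naming convention and is not quoted in this record:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> convention used by the Open LLM Leaderboard.
data = load_dataset(
    "open-llm-leaderboard/details_pinkyponky__Mistral-7b-instruct-v0.2-summ-sft-e3",
    "harness_winogrande_5",
    split="latest",  # the config listing defines "latest" plus per-run timestamped splits
)
```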
[ "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:55:44.790706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3\n\n\n\nDataset automatically created during the evaluation run of model pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:55:44.790706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
980d9fb2c1cf990c5827afe683c08b3e5fad8cf9
# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_MOE_19B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [yunconglong/Truthful_DPO_MOE_19B](https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T05:49:46.084708](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B/blob/main/results_2024-01-21T05-49-46.084708.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6651543092121811, "acc_stderr": 0.03170223263040272, "acc_norm": 0.6659497156886632, "acc_norm_stderr": 0.03234845655117025, "mc1": 0.5801713586291309, "mc1_stderr": 0.017277030301775766, "mc2": 0.7229451135419377, "mc2_stderr": 0.014949043344645354 }, "harness|arc:challenge|25": { "acc": 0.6868600682593856, "acc_stderr": 0.013552671543623496, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393441 }, "harness|hellaswag|10": { "acc": 0.7132045409281019, "acc_stderr": 0.004513409114983828, "acc_norm": 0.8845847440748855, "acc_norm_stderr": 0.003188694028453633 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.743421052631579, "acc_stderr": 0.0355418036802569, "acc_norm": 0.743421052631579, "acc_norm_stderr": 0.0355418036802569 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.04461960433384741, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.625531914893617, "acc_stderr": 0.03163910665367291, "acc_norm": 0.625531914893617, "acc_norm_stderr": 0.03163910665367291 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6413793103448275, "acc_stderr": 0.039966295748767186, "acc_norm": 0.6413793103448275, "acc_norm_stderr": 0.039966295748767186 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4947089947089947, "acc_stderr": 0.02574986828855657, "acc_norm": 0.4947089947089947, "acc_norm_stderr": 0.02574986828855657 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8129032258064516, "acc_stderr": 0.022185710092252252, "acc_norm": 0.8129032258064516, "acc_norm_stderr": 0.022185710092252252 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.035179450386910616, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.806060606060606, "acc_stderr": 0.03087414513656209, "acc_norm": 0.806060606060606, "acc_norm_stderr": 0.03087414513656209 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8636363636363636, "acc_stderr": 0.024450155973189835, "acc_norm": 0.8636363636363636, "acc_norm_stderr": 0.024450155973189835 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644244, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644244 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02944316932303154, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02944316932303154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634335, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634335 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, 
"acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374308, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374308 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.03372343271653062, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.03372343271653062 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.02450980392156862, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.02450980392156862 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8481012658227848, "acc_stderr": 0.023363878096632446, "acc_norm": 0.8481012658227848, "acc_norm_stderr": 0.023363878096632446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776678, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776678 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728743, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728743 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.03492606476623791, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.03492606476623791 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459754, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459754 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913917, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913917 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.394413407821229, "acc_stderr": 0.01634538676210397, "acc_norm": 0.394413407821229, "acc_norm_stderr": 0.01634538676210397 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.0254942593506949, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.0254942593506949 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7808641975308642, "acc_stderr": 0.02301670564026219, "acc_norm": 0.7808641975308642, "acc_norm_stderr": 0.02301670564026219 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.5070921985815603, "acc_stderr": 0.02982449855912901, "acc_norm": 0.5070921985815603, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4921773142112125, "acc_stderr": 0.0127686730761119, "acc_norm": 0.4921773142112125, "acc_norm_stderr": 0.0127686730761119 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7426470588235294, "acc_stderr": 0.026556519470041513, "acc_norm": 0.7426470588235294, "acc_norm_stderr": 0.026556519470041513 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.01882421951270621, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.01882421951270621 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.5801713586291309, "mc1_stderr": 0.017277030301775766, "mc2": 0.7229451135419377, "mc2_stderr": 0.014949043344645354 }, "harness|winogrande|5": { "acc": 0.8334648776637726, "acc_stderr": 0.010470796496781096 }, "harness|gsm8k|5": { "acc": 0.645185746777862, "acc_stderr": 0.013179083387979205 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
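The per-task block in the "Latest results" section above is a flat JSON object keyed by harness task name. As a minimal sketch (the variable `results_json` and the ranking below are illustrative, not part of the generated card), the lowest-scoring MMLU-style subtasks can be pulled out of it like this:

```python
import json

# results_json is assumed to hold the "Latest results" JSON block shown above.
results = json.loads(results_json)

# Rank the hendrycksTest (MMLU) subtasks by normalized accuracy, lowest first.
weakest = sorted(
    (
        (task, scores["acc_norm"])
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-") and "acc_norm" in scores
    ),
    key=lambda pair: pair[1],
)[:5]

# e.g. college_mathematics and global_facts sit near the bottom at 0.33.
print(weakest)
```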
open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B
[ "region:us" ]
2024-01-21T05:52:02+00:00
{"pretty_name": "Evaluation run of yunconglong/Truthful_DPO_MOE_19B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/Truthful_DPO_MOE_19B](https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T05:49:46.084708](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B/blob/main/results_2024-01-21T05-49-46.084708.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6651543092121811,\n \"acc_stderr\": 0.03170223263040272,\n \"acc_norm\": 0.6659497156886632,\n \"acc_norm_stderr\": 0.03234845655117025,\n \"mc1\": 0.5801713586291309,\n \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7229451135419377,\n \"mc2_stderr\": 0.014949043344645354\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623496,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7132045409281019,\n \"acc_stderr\": 0.004513409114983828,\n \"acc_norm\": 0.8845847440748855,\n \"acc_norm_stderr\": 0.003188694028453633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8033205619412516,\n \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5801713586291309,\n \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.7229451135419377,\n \"mc2_stderr\": 0.014949043344645354\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781096\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.645185746777862,\n \"acc_stderr\": 0.013179083387979205\n }\n}\n```", 
"repo_url": "https://huggingface.co/yunconglong/Truthful_DPO_MOE_19B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T05_49_46.084708", "path": ["**/details_harness|winogrande|5_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T05-49-46.084708.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T05_49_46.084708", "path": ["results_2024-01-21T05-49-46.084708.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T05-49-46.084708.parquet"]}]}]}
2024-01-21T05:52:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_MOE_19B Dataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_MOE_19B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T05:49:46.084708 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
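The plain-text rendition above ends its "To load the details from a run" paragraph without the accompanying snippet; the code below is a minimal sketch mirroring the example embedded in the dataset metadata (repository id and `harness_winogrande_5` config taken from there):

```python
from datasets import load_dataset

# The card states that the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_yunconglong__Truthful_DPO_MOE_19B",
    "harness_winogrande_5",
    split="train",
)
```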
[ "# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_MOE_19B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_MOE_19B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:49:46.084708(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_MOE_19B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_MOE_19B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T05:49:46.084708(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
88a775fa23e88a72a02031248d3705ed058172e5
# Dataset Card for Evaluation run of codemateai/CodeMate-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [codemateai/CodeMate-v0.1](https://huggingface.co/codemateai/CodeMate-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_codemateai__CodeMate-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T06:09:28.269211](https://huggingface.co/datasets/open-llm-leaderboard/details_codemateai__CodeMate-v0.1/blob/main/results_2024-01-21T06-09-28.269211.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5539633133523945, "acc_stderr": 0.03410596348207958, "acc_norm": 0.5569723735975259, "acc_norm_stderr": 0.03480720286888438, "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113503, "mc2": 0.4864060014239976, "mc2_stderr": 0.015446552144630315 }, "harness|arc:challenge|25": { "acc": 0.5452218430034129, "acc_stderr": 0.014551507060836359, "acc_norm": 0.5554607508532423, "acc_norm_stderr": 0.01452122640562708 }, "harness|hellaswag|10": { "acc": 0.5930093606851224, "acc_stderr": 0.004902690765066425, "acc_norm": 0.7803226448914559, "acc_norm_stderr": 0.004131818797713872 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.04266763404099582, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.040335656678483205, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.040335656678483205 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5132075471698113, "acc_stderr": 0.030762134874500476, "acc_norm": 0.5132075471698113, "acc_norm_stderr": 0.030762134874500476 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5486111111111112, "acc_stderr": 0.041614023984032786, "acc_norm": 0.5486111111111112, "acc_norm_stderr": 0.041614023984032786 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.47398843930635837, "acc_stderr": 0.03807301726504511, "acc_norm": 0.47398843930635837, "acc_norm_stderr": 0.03807301726504511 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.502127659574468, "acc_stderr": 0.032685726586674915, "acc_norm": 0.502127659574468, "acc_norm_stderr": 0.032685726586674915 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.04615186962583703, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.04615186962583703 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.04166567577101579, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440677, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440677 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6, "acc_stderr": 0.027869320571664632, "acc_norm": 0.6, "acc_norm_stderr": 0.027869320571664632 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4088669950738916, "acc_stderr": 0.034590588158832314, "acc_norm": 0.4088669950738916, "acc_norm_stderr": 0.034590588158832314 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.702020202020202, "acc_stderr": 0.03258630383836557, "acc_norm": 0.702020202020202, "acc_norm_stderr": 0.03258630383836557 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7668393782383419, "acc_stderr": 0.03051611137147601, "acc_norm": 0.7668393782383419, "acc_norm_stderr": 0.03051611137147601 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5282051282051282, "acc_stderr": 0.025310639254933882, "acc_norm": 0.5282051282051282, "acc_norm_stderr": 0.025310639254933882 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340492, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340492 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5378151260504201, "acc_stderr": 0.032385469487589795, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, 
"acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7229357798165138, "acc_stderr": 0.01918848259016953, "acc_norm": 0.7229357798165138, "acc_norm_stderr": 0.01918848259016953 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538272, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501943, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501943 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036433, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036433 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5739910313901345, "acc_stderr": 0.03318833286217281, "acc_norm": 0.5739910313901345, "acc_norm_stderr": 0.03318833286217281 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5343511450381679, "acc_stderr": 0.04374928560599738, "acc_norm": 0.5343511450381679, "acc_norm_stderr": 0.04374928560599738 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.036230899157241474, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.036230899157241474 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260594, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260594 }, "harness|hendrycksTest-marketing|5": { "acc": 0.782051282051282, "acc_stderr": 0.027046857630716677, "acc_norm": 0.782051282051282, "acc_norm_stderr": 0.027046857630716677 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7011494252873564, "acc_stderr": 0.016369256815093127, "acc_norm": 0.7011494252873564, "acc_norm_stderr": 0.016369256815093127 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5982658959537572, "acc_stderr": 0.026394104177643637, "acc_norm": 0.5982658959537572, "acc_norm_stderr": 0.026394104177643637 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.35083798882681566, "acc_stderr": 0.015961036675230963, "acc_norm": 0.35083798882681566, "acc_norm_stderr": 0.015961036675230963 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5588235294117647, "acc_stderr": 0.028431095444176643, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.028431095444176643 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5884244372990354, "acc_stderr": 0.02795048149440127, "acc_norm": 0.5884244372990354, "acc_norm_stderr": 0.02795048149440127 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.595679012345679, "acc_stderr": 0.02730662529732768, "acc_norm": 0.595679012345679, "acc_norm_stderr": 0.02730662529732768 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.39361702127659576, 
"acc_stderr": 0.029144544781596143, "acc_norm": 0.39361702127659576, "acc_norm_stderr": 0.029144544781596143 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39374185136897, "acc_stderr": 0.012478532272564439, "acc_norm": 0.39374185136897, "acc_norm_stderr": 0.012478532272564439 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4963235294117647, "acc_stderr": 0.030372015885428188, "acc_norm": 0.4963235294117647, "acc_norm_stderr": 0.030372015885428188 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5130718954248366, "acc_stderr": 0.020220920829626916, "acc_norm": 0.5130718954248366, "acc_norm_stderr": 0.020220920829626916 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.673469387755102, "acc_stderr": 0.03002105623844031, "acc_norm": 0.673469387755102, "acc_norm_stderr": 0.03002105623844031 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.03115715086935558, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.03115715086935558 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7134502923976608, "acc_stderr": 0.03467826685703826, "acc_norm": 0.7134502923976608, "acc_norm_stderr": 0.03467826685703826 }, "harness|truthfulqa:mc|0": { "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113503, "mc2": 0.4864060014239976, "mc2_stderr": 0.015446552144630315 }, "harness|winogrande|5": { "acc": 0.7261247040252565, "acc_stderr": 0.0125332927326203 }, "harness|gsm8k|5": { "acc": 0.40181956027293403, "acc_stderr": 0.01350435778749404 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
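As a convenience, the aggregated scores shown in the "Latest results" section can also be loaded directly from the "results" configuration rather than from the per-task detail files. The snippet below is a minimal sketch that reuses the same `datasets` loading pattern as the example above; the "results" configuration and its "latest" split come from this dataset's config list.

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics for the most recent run.
# The "latest" split of the "results" configuration points to
# results_2024-01-21T06-09-28.269211.parquet for this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_codemateai__CodeMate-v0.1",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores recorded for the run
```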
open-llm-leaderboard/details_codemateai__CodeMate-v0.1
[ "region:us" ]
2024-01-21T06:11:48+00:00
{"pretty_name": "Evaluation run of codemateai/CodeMate-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [codemateai/CodeMate-v0.1](https://huggingface.co/codemateai/CodeMate-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codemateai__CodeMate-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T06:09:28.269211](https://huggingface.co/datasets/open-llm-leaderboard/details_codemateai__CodeMate-v0.1/blob/main/results_2024-01-21T06-09-28.269211.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5539633133523945,\n \"acc_stderr\": 0.03410596348207958,\n \"acc_norm\": 0.5569723735975259,\n \"acc_norm_stderr\": 0.03480720286888438,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4864060014239976,\n \"mc2_stderr\": 0.015446552144630315\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836359,\n \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.01452122640562708\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5930093606851224,\n \"acc_stderr\": 0.004902690765066425,\n \"acc_norm\": 0.7803226448914559,\n \"acc_norm_stderr\": 0.004131818797713872\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440677,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440677\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5282051282051282,\n 
\"acc_stderr\": 0.025310639254933882,\n \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933882\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036433,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036433\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7011494252873564,\n \"acc_stderr\": 0.016369256815093127,\n \"acc_norm\": 0.7011494252873564,\n 
\"acc_norm_stderr\": 0.016369256815093127\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643637,\n \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643637\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.015961036675230963,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.015961036675230963\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.02730662529732768,\n \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.02730662529732768\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39374185136897,\n \"acc_stderr\": 0.012478532272564439,\n \"acc_norm\": 0.39374185136897,\n \"acc_norm_stderr\": 0.012478532272564439\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428188,\n \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428188\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.020220920829626916,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.020220920829626916\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935558,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4864060014239976,\n \"mc2_stderr\": 0.015446552144630315\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.0125332927326203\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 0.01350435778749404\n }\n}\n```", "repo_url": "https://huggingface.co/codemateai/CodeMate-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-09-28.269211.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-09-28.269211.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-09-28.269211.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-09-28.269211.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-09-28.269211.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-09-28.269211.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["**/details_harness|winogrande|5_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T06-09-28.269211.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T06_09_28.269211", "path": ["results_2024-01-21T06-09-28.269211.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T06-09-28.269211.parquet"]}]}]}
2024-01-21T06:12:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of codemateai/CodeMate-v0.1 Dataset automatically created during the evaluation run of model codemateai/CodeMate-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T06:09:28.269211 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
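The loading snippet this card refers to ("you can for instance do the following") was dropped when the text was flattened; a minimal sketch in the same style as the other evaluation-run cards in this dump follows. The repository id is inferred from the details_<org>__<model> naming convention used by those cards rather than stated in this record, and harness_winogrande_5 is just one of the 63 configurations; the "latest" split of each configuration aliases the single timestamped run listed in the metadata above.

```python
from datasets import load_dataset

# Sketch only: repository id inferred from the details_<org>__<model> convention
# used by the other evaluation-run datasets in this dump, not stated in this record.
data = load_dataset(
    "open-llm-leaderboard/details_codemateai__CodeMate-v0.1",
    "harness_winogrande_5",
    split="train",
)
```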
[ "# Dataset Card for Evaluation run of codemateai/CodeMate-v0.1\n\n\n\nDataset automatically created during the evaluation run of model codemateai/CodeMate-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:09:28.269211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of codemateai/CodeMate-v0.1\n\n\n\nDataset automatically created during the evaluation run of model codemateai/CodeMate-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:09:28.269211(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
140c315e21da526a7073b19fb1b4e78ef4e801c9
# Dataset Card for "Agent Profiles and Capabilities Dataset" ## Table of Contents - [Dataset Description](#dataset-description) - [Data Structure](#data-structure) - [Intended Use](#intended-use) - [Data Collection and Preparation](#data-collection-and-preparation) - [Dataset Limitations](#dataset-limitations) - [Ethical Considerations](#ethical-considerations) - [Licensing and Access](#licensing-and-access) ## Dataset Description ### General Description This dataset contains detailed profiles of agents, including their reasoning capabilities, workflow descriptions, technical documents, and discussion guides. Each profile provides insights into specific areas of expertise, such as network installation, payment processing, and scheduling systems. The dataset is designed to support the development of intelligent systems that can understand and simulate complex workflows and technical discussions. ### Context The data is fully synthetic, created to reflect real-world applications in technology and business sectors without using actual data from existing sources. ### Content Summary Each row in the dataset includes a reasoning section, workflow details with API specifications, technical documents, and a discussion guide with factual and trick question-and-answer pairs. ## Data Structure - **Format**: JSON Lines (JSONL) - **Schema**: - `reasoning`: Textual reasoning and context explanation. - `workflow`: Array of objects detailing APIs involved in the workflow. - `document`: A technical or reference document. - `discussionFactualTrick`: Object containing a discussion guide, factual Q&A, and trick Q&A. ## Intended Use - **Target Audience**: Researchers and developers in artificial intelligence, particularly those working on natural language understanding, dialogue systems, and workflow automation. - **Applications**: Training AI models for technical support chatbots, workflow automation tools, and educational platforms focusing on technical training and troubleshooting. ## Data Collection and Preparation - **Collection Method**: The dataset is entirely synthetic, generated to simulate real-world technical scenarios and discussions. - **Preprocessing**: Data is structured into a consistent JSONL format, with each row representing a comprehensive agent profile. ## Dataset Limitations - **Representation Bias**: The dataset might be skewed towards certain types of technical workflows and may not represent a diverse range of industries or non-technical scenarios. - **Contextual Limitation**: The dataset focuses on technical aspects and might not adequately cover soft skills or non-technical discussions. ## Ethical Considerations - **Data Privacy**: The dataset is synthetic and does not contain personally identifiable information or real proprietary data. - **Use Case Restrictions**: Intended for research and development purposes. Users should consider ethical implications when deploying models trained on this data in real-world applications. ## Licensing and Access - **License**: None - **Access**: None
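As a minimal, self-contained sketch of how one row of this JSONL could be consumed: the field names are taken from the schema described above, while the file name and the printed summary are purely illustrative.

```python
import json

# Illustrative file name; the field names come from the schema described above.
with open("agent_profiles.jsonl", "r", encoding="utf-8") as f:
    for line in f:
        row = json.loads(line)
        reasoning = row["reasoning"]           # textual reasoning and context
        workflow = row["workflow"]             # list of API specification objects
        document = row["document"]             # technical or reference document
        guide = row["discussionFactualTrick"]  # discussion guide + factual/trick Q&A
        print(f"{len(workflow)} APIs referenced, document of {len(document)} characters")
```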
Cyleux/fullJan20Agents
[ "region:us" ]
2024-01-21T06:15:52+00:00
{}
2024-01-21T06:23:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "Agent Profiles and Capabilities Dataset" ## Table of Contents - Dataset Description - Data Structure - Intended Use - Data Collection and Preparation - Dataset Limitations - Ethical Considerations - Licensing and Access ## Dataset Description ### General Description This dataset contains detailed profiles of agents, including their reasoning capabilities, workflow descriptions, technical documents, and discussion guides. Each profile provides insights into specific areas of expertise, such as network installation, payment processing, and scheduling systems. The dataset is designed to support the development of intelligent systems that can understand and simulate complex workflows and technical discussions. ### Context The data is fully synthetic, created to reflect real-world applications in technology and business sectors without using actual data from existing sources. ### Content Summary Each row in the dataset includes a reasoning section, workflow details with API specifications, technical documents, and a discussion guide with factual and trick question-and-answer pairs. ## Data Structure - Format: JSON Lines (JSONL) - Schema: - 'reasoning': Textual reasoning and context explanation. - 'workflow': Array of objects detailing APIs involved in the workflow. - 'document': A technical or reference document. - 'discussionFactualTrick': Object containing a discussion guide, factual Q&A, and trick Q&A. ## Intended Use - Target Audience: Researchers and developers in artificial intelligence, particularly those working on natural language understanding, dialogue systems, and workflow automation. - Applications: Training AI models for technical support chatbots, workflow automation tools, and educational platforms focusing on technical training and troubleshooting. ## Data Collection and Preparation - Collection Method: The dataset is entirely synthetic, generated to simulate real-world technical scenarios and discussions. - Preprocessing: Data is structured into a consistent JSONL format, with each row representing a comprehensive agent profile. ## Dataset Limitations - Representation Bias: The dataset might be skewed towards certain types of technical workflows and may not represent a diverse range of industries or non-technical scenarios. - Contextual Limitation: The dataset focuses on technical aspects and might not adequately cover soft skills or non-technical discussions. ## Ethical Considerations - Data Privacy: The dataset is synthetic and does not contain personally identifiable information or real proprietary data. - Use Case Restrictions: Intended for research and development purposes. Users should consider ethical implications when deploying models trained on this data in real-world applications. ## Licensing and Access - License: None - Access: None
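If the dataset is published on the Hub under the id shown above (Cyleux/fullJan20Agents) and exposes its rows as a default split, loading it could look like the following sketch; the split name and the printed keys are assumptions based on the schema described in this card.

```python
from datasets import load_dataset

# Sketch only: assumes a default "train" split and the schema described above.
profiles = load_dataset("Cyleux/fullJan20Agents", split="train")
first = profiles[0]
print(sorted(first.keys()))  # expected: discussionFactualTrick, document, reasoning, workflow
```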
[ "# Dataset Card for \"Agent Profiles and Capabilities Dataset\"", "## Table of Contents\n- Dataset Description\n- Data Structure\n- Intended Use\n- Data Collection and Preparation\n- Dataset Limitations\n- Ethical Considerations\n- Licensing and Access", "## Dataset Description", "### General Description\nThis dataset contains detailed profiles of agents, including their reasoning capabilities, workflow descriptions, technical documents, and discussion guides. Each profile provides insights into specific areas of expertise, such as network installation, payment processing, and scheduling systems. The dataset is designed to support the development of intelligent systems that can understand and simulate complex workflows and technical discussions.", "### Context\nThe data is fully synthetic, created to reflect real-world applications in technology and business sectors without using actual data from existing sources.", "### Content Summary\nEach row in the dataset includes a reasoning section, workflow details with API specifications, technical documents, and a discussion guide with factual and trick question-and-answer pairs.", "## Data Structure\n\n- Format: JSON Lines (JSONL)\n- Schema:\n - 'reasoning': Textual reasoning and context explanation.\n - 'workflow': Array of objects detailing APIs involved in the workflow.\n - 'document': A technical or reference document.\n - 'discussionFactualTrick': Object containing a discussion guide, factual Q&A, and trick Q&A.", "## Intended Use\n\n- Target Audience: Researchers and developers in artificial intelligence, particularly those working on natural language understanding, dialogue systems, and workflow automation.\n- Applications: Training AI models for technical support chatbots, workflow automation tools, and educational platforms focusing on technical training and troubleshooting.", "## Data Collection and Preparation\n\n- Collection Method: The dataset is entirely synthetic, generated to simulate real-world technical scenarios and discussions.\n- Preprocessing: Data is structured into a consistent JSONL format, with each row representing a comprehensive agent profile.", "## Dataset Limitations\n\n- Representation Bias: The dataset might be skewed towards certain types of technical workflows and may not represent a diverse range of industries or non-technical scenarios.\n- Contextual Limitation: The dataset focuses on technical aspects and might not adequately cover soft skills or non-technical discussions.", "## Ethical Considerations\n\n- Data Privacy: The dataset is synthetic and does not contain personally identifiable information or real proprietary data.\n- Use Case Restrictions: Intended for research and development purposes. Users should consider ethical implications when deploying models trained on this data in real-world applications.", "## Licensing and Access\n\n- License: None\n- Access: None" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"Agent Profiles and Capabilities Dataset\"", "## Table of Contents\n- Dataset Description\n- Data Structure\n- Intended Use\n- Data Collection and Preparation\n- Dataset Limitations\n- Ethical Considerations\n- Licensing and Access", "## Dataset Description", "### General Description\nThis dataset contains detailed profiles of agents, including their reasoning capabilities, workflow descriptions, technical documents, and discussion guides. Each profile provides insights into specific areas of expertise, such as network installation, payment processing, and scheduling systems. The dataset is designed to support the development of intelligent systems that can understand and simulate complex workflows and technical discussions.", "### Context\nThe data is fully synthetic, created to reflect real-world applications in technology and business sectors without using actual data from existing sources.", "### Content Summary\nEach row in the dataset includes a reasoning section, workflow details with API specifications, technical documents, and a discussion guide with factual and trick question-and-answer pairs.", "## Data Structure\n\n- Format: JSON Lines (JSONL)\n- Schema:\n - 'reasoning': Textual reasoning and context explanation.\n - 'workflow': Array of objects detailing APIs involved in the workflow.\n - 'document': A technical or reference document.\n - 'discussionFactualTrick': Object containing a discussion guide, factual Q&A, and trick Q&A.", "## Intended Use\n\n- Target Audience: Researchers and developers in artificial intelligence, particularly those working on natural language understanding, dialogue systems, and workflow automation.\n- Applications: Training AI models for technical support chatbots, workflow automation tools, and educational platforms focusing on technical training and troubleshooting.", "## Data Collection and Preparation\n\n- Collection Method: The dataset is entirely synthetic, generated to simulate real-world technical scenarios and discussions.\n- Preprocessing: Data is structured into a consistent JSONL format, with each row representing a comprehensive agent profile.", "## Dataset Limitations\n\n- Representation Bias: The dataset might be skewed towards certain types of technical workflows and may not represent a diverse range of industries or non-technical scenarios.\n- Contextual Limitation: The dataset focuses on technical aspects and might not adequately cover soft skills or non-technical discussions.", "## Ethical Considerations\n\n- Data Privacy: The dataset is synthetic and does not contain personally identifiable information or real proprietary data.\n- Use Case Restrictions: Intended for research and development purposes. Users should consider ethical implications when deploying models trained on this data in real-world applications.", "## Licensing and Access\n\n- License: None\n- Access: None" ]
e8f65aa6d929daa48a970e19279c9ff4e341801f
# Dataset Card for Evaluation run of genaicore3434/MistralLite-summ-sft-e1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [genaicore3434/MistralLite-summ-sft-e1](https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T06:35:30.064229](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1/blob/main/results_2024-01-21T06-35-30.064229.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5209243088809142, "acc_stderr": 0.034285125251915134, "acc_norm": 0.5285550597511598, "acc_norm_stderr": 0.03510432192642734, "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834555, "mc2": 0.40848131883657496, "mc2_stderr": 0.014577935602536028 }, "harness|arc:challenge|25": { "acc": 0.5358361774744027, "acc_stderr": 0.014573813664735718, "acc_norm": 0.575938566552901, "acc_norm_stderr": 0.0144418896274644 }, "harness|hellaswag|10": { "acc": 0.6031666998605856, "acc_stderr": 0.0048824100299354415, "acc_norm": 0.8066122286397132, "acc_norm_stderr": 0.003941471781664182 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4666666666666667, "acc_stderr": 0.043097329010363554, "acc_norm": 0.4666666666666667, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.04065771002562605, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.04065771002562605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6388888888888888, "acc_stderr": 0.04016660030451233, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.04016660030451233 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, 
"acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.04615186962583703, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.04615186962583703 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3412698412698413, "acc_stderr": 0.02441923496681907, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.02441923496681907 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4935483870967742, "acc_stderr": 0.02844163823354051, "acc_norm": 0.4935483870967742, "acc_norm_stderr": 0.02844163823354051 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4088669950738916, "acc_stderr": 0.03459058815883231, "acc_norm": 0.4088669950738916, "acc_norm_stderr": 0.03459058815883231 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6242424242424243, "acc_stderr": 0.03781887353205982, "acc_norm": 0.6242424242424243, "acc_norm_stderr": 0.03781887353205982 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6414141414141414, "acc_stderr": 0.03416903640391521, "acc_norm": 0.6414141414141414, "acc_norm_stderr": 0.03416903640391521 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.772020725388601, "acc_stderr": 0.030276909945178267, "acc_norm": 0.772020725388601, "acc_norm_stderr": 0.030276909945178267 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5769230769230769, "acc_stderr": 0.02504919787604234, "acc_norm": 0.5769230769230769, "acc_norm_stderr": 0.02504919787604234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5798319327731093, "acc_stderr": 0.03206183783236152, "acc_norm": 0.5798319327731093, "acc_norm_stderr": 0.03206183783236152 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5889908256880734, "acc_stderr": 0.021095050687277652, "acc_norm": 0.5889908256880734, "acc_norm_stderr": 0.021095050687277652 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.032282103870378914, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.032282103870378914 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.02875679962965834, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6143497757847534, "acc_stderr": 0.03266842214289201, "acc_norm": 0.6143497757847534, "acc_norm_stderr": 0.03266842214289201 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5648854961832062, "acc_stderr": 0.04348208051644858, "acc_norm": 0.5648854961832062, "acc_norm_stderr": 0.04348208051644858 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.04205953933884123, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.04205953933884123 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6759259259259259, "acc_stderr": 0.04524596007030049, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.04524596007030049 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.6990291262135923, "acc_stderr": 0.04541609446503947, "acc_norm": 0.6990291262135923, "acc_norm_stderr": 0.04541609446503947 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6709401709401709, "acc_stderr": 0.030782321577688183, "acc_norm": 0.6709401709401709, "acc_norm_stderr": 0.030782321577688183 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6628352490421456, "acc_stderr": 0.016905207420803554, "acc_norm": 0.6628352490421456, "acc_norm_stderr": 0.016905207420803554 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5924855491329479, "acc_stderr": 0.026454578146931505, "acc_norm": 0.5924855491329479, "acc_norm_stderr": 0.026454578146931505 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.28268156424581004, "acc_stderr": 0.015060381730018106, "acc_norm": 0.28268156424581004, "acc_norm_stderr": 0.015060381730018106 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.545751633986928, "acc_stderr": 0.028509807802626592, "acc_norm": 0.545751633986928, "acc_norm_stderr": 0.028509807802626592 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6334405144694534, "acc_stderr": 0.02736807824397163, "acc_norm": 0.6334405144694534, "acc_norm_stderr": 0.02736807824397163 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.02762873715566877, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.02762873715566877 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.37943262411347517, "acc_stderr": 0.028947338851614112, "acc_norm": 0.37943262411347517, "acc_norm_stderr": 0.028947338851614112 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3983050847457627, "acc_stderr": 0.012503310565166247, "acc_norm": 0.3983050847457627, "acc_norm_stderr": 0.012503310565166247 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4889705882352941, "acc_stderr": 0.030365446477275668, "acc_norm": 0.4889705882352941, "acc_norm_stderr": 0.030365446477275668 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5098039215686274, "acc_stderr": 0.0202239460050743, "acc_norm": 0.5098039215686274, "acc_norm_stderr": 0.0202239460050743 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.030299506562154185, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.030299506562154185 }, "harness|hendrycksTest-sociology|5": { "acc": 0.46766169154228854, "acc_stderr": 0.035281314729336065, "acc_norm": 0.46766169154228854, "acc_norm_stderr": 0.035281314729336065 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6198830409356725, "acc_stderr": 0.03722965741385539, "acc_norm": 0.6198830409356725, "acc_norm_stderr": 0.03722965741385539 }, "harness|truthfulqa:mc|0": { "mc1": 0.26805385556915545, "mc1_stderr": 0.015506204722834555, "mc2": 0.40848131883657496, "mc2_stderr": 0.014577935602536028 }, "harness|winogrande|5": { "acc": 0.7616416732438832, "acc_stderr": 0.011974948667702311 }, "harness|gsm8k|5": { "acc": 0.07354056103108415, "acc_stderr": 0.007189835754365272 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
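Since this record aggregates three runs, each run is exposed as its own timestamped split alongside the "latest" alias (see the config metadata below). A minimal sketch of selecting a specific run versus the latest one follows; the repository, config, and split names are taken from that metadata, and the length comparison is only illustrative.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1"

# "latest" resolves to the 2024-01-21T06:35:30 run; earlier runs remain
# addressable through their own timestamped splits.
latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")
first_run = load_dataset(REPO, "harness_gsm8k_5", split="2024_01_21T06_15_57.278961")
print(len(latest), len(first_run))
```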
open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1
[ "region:us" ]
2024-01-21T06:18:18+00:00
{"pretty_name": "Evaluation run of genaicore3434/MistralLite-summ-sft-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [genaicore3434/MistralLite-summ-sft-e1](https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T06:35:30.064229](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1/blob/main/results_2024-01-21T06-35-30.064229.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5209243088809142,\n \"acc_stderr\": 0.034285125251915134,\n \"acc_norm\": 0.5285550597511598,\n \"acc_norm_stderr\": 0.03510432192642734,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.40848131883657496,\n \"mc2_stderr\": 0.014577935602536028\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5358361774744027,\n \"acc_stderr\": 0.014573813664735718,\n \"acc_norm\": 0.575938566552901,\n \"acc_norm_stderr\": 0.0144418896274644\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6031666998605856,\n \"acc_stderr\": 0.0048824100299354415,\n \"acc_norm\": 0.8066122286397132,\n \"acc_norm_stderr\": 0.003941471781664182\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04016660030451233\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.03459058815883231,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.03459058815883231\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178267,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 
0.030276909945178267\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.02504919787604234,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5889908256880734,\n \"acc_stderr\": 0.021095050687277652,\n \"acc_norm\": 0.5889908256880734,\n \"acc_norm_stderr\": 0.021095050687277652\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.032282103870378914,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378914\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.030782321577688183,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.030782321577688183\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6628352490421456,\n \"acc_stderr\": 0.016905207420803554,\n \"acc_norm\": 0.6628352490421456,\n \"acc_norm_stderr\": 0.016905207420803554\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n \"acc_stderr\": 0.015060381730018106,\n \"acc_norm\": 0.28268156424581004,\n \"acc_norm_stderr\": 0.015060381730018106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.02762873715566877,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.02762873715566877\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614112,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614112\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n \"acc_stderr\": 0.012503310565166247,\n \"acc_norm\": 0.3983050847457627,\n \"acc_norm_stderr\": 0.012503310565166247\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.46766169154228854,\n \"acc_stderr\": 0.035281314729336065,\n \"acc_norm\": 0.46766169154228854,\n \"acc_norm_stderr\": 0.035281314729336065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.40848131883657496,\n \"mc2_stderr\": 0.014577935602536028\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702311\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \"acc_stderr\": 
0.007189835754365272\n }\n}\n```", "repo_url": "https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-15-57.278961.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-15-57.278961.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-15-57.278961.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-23-01.357164.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-23-01.357164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": 
"2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["**/details_harness|winogrande|5_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": ["**/details_harness|winogrande|5_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["**/details_harness|winogrande|5_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T06-35-30.064229.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T06_15_57.278961", "path": ["results_2024-01-21T06-15-57.278961.parquet"]}, {"split": "2024_01_21T06_23_01.357164", "path": 
["results_2024-01-21T06-23-01.357164.parquet"]}, {"split": "2024_01_21T06_35_30.064229", "path": ["results_2024-01-21T06-35-30.064229.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T06-35-30.064229.parquet"]}]}]}
2024-01-21T06:38:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of genaicore3434/MistralLite-summ-sft-e1 Dataset automatically created during the evaluation run of model genaicore3434/MistralLite-summ-sft-e1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T06:35:30.064229 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
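The "To load the details from a run" sentence in the card above lost its accompanying snippet in this flattened record; below is a minimal sketch of what such a load could look like. The repo id follows the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption rather than a verified identifier; the config and split names are taken from the metadata listed above.

```python
from datasets import load_dataset

# Assumed details repo id, following the Open LLM Leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1",
    "harness_winogrande_5",  # one of the config names listed in the metadata above
    split="latest",          # or a timestamped split such as "2024_01_21T06_35_30.064229"
)
```

Each configuration exposes the timestamped splits plus a "latest" alias, so pinning a timestamped split reproduces one specific evaluation run.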
[ "# Dataset Card for Evaluation run of genaicore3434/MistralLite-summ-sft-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/MistralLite-summ-sft-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:35:30.064229(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of genaicore3434/MistralLite-summ-sft-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/MistralLite-summ-sft-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:35:30.064229(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7f1ef0b5907d72fee2db66d2b86c1022e896c727
# Malicious Logs These are malicious logs collected from my Nginx server. An Isolation Forest is used to collect these logs. Model: [u-haru/log-inspector](https://huggingface.co/u-haru/log-inspector) Code: [github.com/u-haru/log-inspector](https://github.com/u-haru/log-inspector)
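The card names an Isolation Forest as the collection mechanism but gives no detail. Below is a minimal, illustrative sketch (not the actual u-haru/log-inspector code) of flagging anomalous Nginx access-log lines with scikit-learn's IsolationForest over hashed character n-grams; the sample lines, feature choices, and contamination value are assumptions:

```python
from sklearn.ensemble import IsolationForest
from sklearn.feature_extraction.text import HashingVectorizer

# Illustrative access-log lines; in practice these would be read from the log file.
log_lines = [
    '192.0.2.10 - - [21/Jan/2024:06:18:35 +0000] "GET /index.html HTTP/1.1" 200 612',
    '198.51.100.7 - - [21/Jan/2024:06:18:36 +0000] "GET /shop/cart HTTP/1.1" 200 1043',
    '203.0.113.5 - - [21/Jan/2024:06:18:37 +0000] "GET /../../etc/passwd HTTP/1.1" 404 153',
]

# Character n-grams cope with the free-form structure of raw log lines.
vectorizer = HashingVectorizer(analyzer="char_wb", ngram_range=(3, 5), n_features=2**16)
X = vectorizer.transform(log_lines)

# contamination is the assumed fraction of malicious traffic; tune it per server.
detector = IsolationForest(n_estimators=100, contamination=0.1, random_state=0)
labels = detector.fit_predict(X)  # -1 = anomaly (candidate malicious request), 1 = normal

for line, label in zip(log_lines, labels):
    print("suspicious" if label == -1 else "normal    ", line)
```

Lines scored -1 are the outliers that a pipeline like this would write out as malicious logs.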
u-haru/malicious_logs
[ "task_categories:text-classification", "size_categories:1M<n<10M", "license:cc-by-sa-4.0", "region:us" ]
2024-01-21T06:18:35+00:00
{"license": "cc-by-sa-4.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-classification"], "configs": [{"config_name": "access_log", "data_files": "access.txt"}]}
2024-01-21T06:32:06+00:00
[]
[]
TAGS #task_categories-text-classification #size_categories-1M<n<10M #license-cc-by-sa-4.0 #region-us
# Malicious Logs These are malicious logs collected from my Nginx server. An Isolation Forest is used to collect these logs. Model: u-haru/log-inspector Code: URL
[ "# Malicious Logs\n\nThese are malicious logs collected from my Nginx server.\n\nIsoration forest is used to collect these logs.\n\nModel: u-haru/log-inspector \nCode: URL" ]
[ "TAGS\n#task_categories-text-classification #size_categories-1M<n<10M #license-cc-by-sa-4.0 #region-us \n", "# Malicious Logs\n\nThese are malicious logs collected from my Nginx server.\n\nIsoration forest is used to collect these logs.\n\nModel: u-haru/log-inspector \nCode: URL" ]
3760596ca920a9dc5085ebeff976eab079b861bd
# Dataset Card for Evaluation run of AA051610/A0121 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051610/A0121](https://huggingface.co/AA051610/A0121) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051610__A0121", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T06:38:55.912986](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0121/blob/main/results_2024-01-21T06-38-55.912986.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7447539059303026, "acc_stderr": 0.028988249523275268, "acc_norm": 0.7497779847824153, "acc_norm_stderr": 0.029529280939468258, "mc1": 0.40514075887392903, "mc1_stderr": 0.01718561172775337, "mc2": 0.5961151164765033, "mc2_stderr": 0.01535344953348685 }, "harness|arc:challenge|25": { "acc": 0.6467576791808873, "acc_stderr": 0.013967822714840056, "acc_norm": 0.6715017064846417, "acc_norm_stderr": 0.0137249784655373 }, "harness|hellaswag|10": { "acc": 0.6623182632941645, "acc_stderr": 0.004719529099913134, "acc_norm": 0.854511053574985, "acc_norm_stderr": 0.00351872525736559 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7555555555555555, "acc_stderr": 0.03712537833614866, "acc_norm": 0.7555555555555555, "acc_norm_stderr": 0.03712537833614866 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.868421052631579, "acc_stderr": 0.027508689533549912, "acc_norm": 0.868421052631579, "acc_norm_stderr": 0.027508689533549912 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7924528301886793, "acc_stderr": 0.02495991802891127, "acc_norm": 0.7924528301886793, "acc_norm_stderr": 0.02495991802891127 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8263888888888888, "acc_stderr": 0.03167473383795718, "acc_norm": 0.8263888888888888, "acc_norm_stderr": 0.03167473383795718 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, 
"acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.03496101481191178, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.03496101481191178 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5294117647058824, "acc_stderr": 0.049665709039785295, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.049665709039785295 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7574468085106383, "acc_stderr": 0.02802022627120022, "acc_norm": 0.7574468085106383, "acc_norm_stderr": 0.02802022627120022 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6403508771929824, "acc_stderr": 0.04514496132873633, "acc_norm": 0.6403508771929824, "acc_norm_stderr": 0.04514496132873633 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7448275862068966, "acc_stderr": 0.03632984052707842, "acc_norm": 0.7448275862068966, "acc_norm_stderr": 0.03632984052707842 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6825396825396826, "acc_stderr": 0.02397386199899207, "acc_norm": 0.6825396825396826, "acc_norm_stderr": 0.02397386199899207 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5396825396825397, "acc_stderr": 0.04458029125470973, "acc_norm": 0.5396825396825397, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8806451612903226, "acc_stderr": 0.018443411325315434, "acc_norm": 0.8806451612903226, "acc_norm_stderr": 0.018443411325315434 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5665024630541872, "acc_stderr": 0.03486731727419872, "acc_norm": 0.5665024630541872, "acc_norm_stderr": 0.03486731727419872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.041633319989322605, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8303030303030303, "acc_stderr": 0.02931118867498311, "acc_norm": 0.8303030303030303, "acc_norm_stderr": 0.02931118867498311 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9141414141414141, "acc_stderr": 0.01996022556317289, "acc_norm": 0.9141414141414141, "acc_norm_stderr": 0.01996022556317289 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9585492227979274, "acc_stderr": 0.014385432857476442, "acc_norm": 0.9585492227979274, "acc_norm_stderr": 0.014385432857476442 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8051282051282052, "acc_stderr": 0.020083167595181393, "acc_norm": 0.8051282051282052, "acc_norm_stderr": 0.020083167595181393 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.43333333333333335, "acc_stderr": 0.030213340289237924, "acc_norm": 0.43333333333333335, "acc_norm_stderr": 0.030213340289237924 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8445378151260504, "acc_stderr": 0.023536818625398897, "acc_norm": 0.8445378151260504, "acc_norm_stderr": 0.023536818625398897 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4900662251655629, "acc_stderr": 0.04081677107248437, "acc_norm": 0.4900662251655629, "acc_norm_stderr": 
0.04081677107248437 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9119266055045872, "acc_stderr": 0.012150743719481653, "acc_norm": 0.9119266055045872, "acc_norm_stderr": 0.012150743719481653 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6388888888888888, "acc_stderr": 0.032757734861009996, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.032757734861009996 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9117647058823529, "acc_stderr": 0.019907399791316942, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.019907399791316942 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884565, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884565 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929196, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929196 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8854961832061069, "acc_stderr": 0.027927473753597453, "acc_norm": 0.8854961832061069, "acc_norm_stderr": 0.027927473753597453 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9090909090909091, "acc_stderr": 0.02624319405407388, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.02624319405407388 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8703703703703703, "acc_stderr": 0.03247224389917948, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.03247224389917948 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8650306748466258, "acc_stderr": 0.02684576505455385, "acc_norm": 0.8650306748466258, "acc_norm_stderr": 0.02684576505455385 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5625, "acc_stderr": 0.04708567521880525, "acc_norm": 0.5625, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.883495145631068, "acc_stderr": 0.031766839486404054, "acc_norm": 0.883495145631068, "acc_norm_stderr": 0.031766839486404054 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9273504273504274, "acc_stderr": 0.01700436856813233, "acc_norm": 0.9273504273504274, "acc_norm_stderr": 0.01700436856813233 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.85, "acc_stderr": 0.035887028128263714, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263714 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9118773946360154, "acc_stderr": 0.010136978203312642, "acc_norm": 0.9118773946360154, "acc_norm_stderr": 0.010136978203312642 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7890173410404624, "acc_stderr": 0.021966309947043114, "acc_norm": 0.7890173410404624, "acc_norm_stderr": 0.021966309947043114 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6927374301675978, "acc_stderr": 0.015430158846469606, "acc_norm": 0.6927374301675978, "acc_norm_stderr": 0.015430158846469606 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8398692810457516, "acc_stderr": 0.020998740930362306, "acc_norm": 0.8398692810457516, "acc_norm_stderr": 0.020998740930362306 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8070739549839229, "acc_stderr": 0.022411516780911366, "acc_norm": 0.8070739549839229, "acc_norm_stderr": 0.022411516780911366 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8333333333333334, "acc_stderr": 0.020736358408060002, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.020736358408060002 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6205673758865248, "acc_stderr": 0.028947338851614095, 
"acc_norm": 0.6205673758865248, "acc_norm_stderr": 0.028947338851614095 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5710560625814863, "acc_stderr": 0.012640625443067365, "acc_norm": 0.5710560625814863, "acc_norm_stderr": 0.012640625443067365 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8198529411764706, "acc_stderr": 0.02334516361654486, "acc_norm": 0.8198529411764706, "acc_norm_stderr": 0.02334516361654486 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7892156862745098, "acc_stderr": 0.016500472979024808, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.016500472979024808 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8081632653061225, "acc_stderr": 0.0252069631542254, "acc_norm": 0.8081632653061225, "acc_norm_stderr": 0.0252069631542254 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8805970149253731, "acc_stderr": 0.02292879327721974, "acc_norm": 0.8805970149253731, "acc_norm_stderr": 0.02292879327721974 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8830409356725146, "acc_stderr": 0.024648068961366152, "acc_norm": 0.8830409356725146, "acc_norm_stderr": 0.024648068961366152 }, "harness|truthfulqa:mc|0": { "mc1": 0.40514075887392903, "mc1_stderr": 0.01718561172775337, "mc2": 0.5961151164765033, "mc2_stderr": 0.01535344953348685 }, "harness|winogrande|5": { "acc": 0.8042620363062352, "acc_stderr": 0.011151145042218332 }, "harness|gsm8k|5": { "acc": 0.6057619408642911, "acc_stderr": 0.013460852357095661 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_AA051610__A0121
[ "region:us" ]
2024-01-21T06:41:09+00:00
{"pretty_name": "Evaluation run of AA051610/A0121", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/A0121](https://huggingface.co/AA051610/A0121) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A0121\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T06:38:55.912986](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0121/blob/main/results_2024-01-21T06-38-55.912986.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7447539059303026,\n \"acc_stderr\": 0.028988249523275268,\n \"acc_norm\": 0.7497779847824153,\n \"acc_norm_stderr\": 0.029529280939468258,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5961151164765033,\n \"mc2_stderr\": 0.01535344953348685\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840056,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.0137249784655373\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n \"acc_stderr\": 0.004719529099913134,\n \"acc_norm\": 0.854511053574985,\n \"acc_norm_stderr\": 0.00351872525736559\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 
0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.03496101481191178,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.03496101481191178\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.02802022627120022,\n \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.02802022627120022\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6825396825396826,\n \"acc_stderr\": 0.02397386199899207,\n \"acc_norm\": 0.6825396825396826,\n \"acc_norm_stderr\": 0.02397386199899207\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8806451612903226,\n \"acc_stderr\": 0.018443411325315434,\n \"acc_norm\": 0.8806451612903226,\n \"acc_norm_stderr\": 0.018443411325315434\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.03486731727419872,\n \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.03486731727419872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498311,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498311\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8051282051282052,\n \"acc_stderr\": 
0.020083167595181393,\n \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237924,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237924\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398897,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398897\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481653,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481653\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813233,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813233\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9118773946360154,\n \"acc_stderr\": 0.010136978203312642,\n \"acc_norm\": 0.9118773946360154,\n \"acc_norm_stderr\": 
0.010136978203312642\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043114,\n \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6927374301675978,\n \"acc_stderr\": 0.015430158846469606,\n \"acc_norm\": 0.6927374301675978,\n \"acc_norm_stderr\": 0.015430158846469606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362306,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362306\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060002,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n \"acc_stderr\": 0.012640625443067365,\n \"acc_norm\": 0.5710560625814863,\n \"acc_norm_stderr\": 0.012640625443067365\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654486,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654486\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.016500472979024808,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.016500472979024808\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.0252069631542254,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.0252069631542254\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5961151164765033,\n \"mc2_stderr\": 0.01535344953348685\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218332\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6057619408642911,\n \"acc_stderr\": 0.013460852357095661\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/A0121", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-38-55.912986.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-38-55.912986.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-38-55.912986.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-38-55.912986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-38-55.912986.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-38-55.912986.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["**/details_harness|winogrande|5_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T06-38-55.912986.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T06_38_55.912986", "path": ["results_2024-01-21T06-38-55.912986.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T06-38-55.912986.parquet"]}]}]}
2024-01-21T06:41:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051610/A0121 Dataset automatically created during the evaluation run of model AA051610/A0121 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T06:38:55.912986 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
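The loading snippet referenced in the card above ("To load the details from a run, you can for instance do the following:") was stripped during text extraction. A minimal sketch, assuming the repository follows the leaderboard naming convention — the id `open-llm-leaderboard/details_AA051610__A0121` is inferred from the model name and is not shown verbatim in this entry:

```python
from datasets import load_dataset

# Repository id is inferred from the "AA051610/A0121" model name and the
# open-llm-leaderboard naming convention; verify it before relying on it.
data = load_dataset(
    "open-llm-leaderboard/details_AA051610__A0121",
    "harness_winogrande_5",  # any per-task config name from the metadata above works here
    split="train",           # per the card, "train" points to the latest results
)
```

Per the configuration metadata earlier in this entry, each config also exposes a timestamped split and a "latest" split that can be requested instead of "train".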
[ "# Dataset Card for Evaluation run of AA051610/A0121\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0121 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:38:55.912986(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051610/A0121\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A0121 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:38:55.912986(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1dc32c25e41491bd7dcc5729474336ec13414ed1
210 rows from the [MS_MARCO Dataset](https://huggingface.co/datasets/ms_marco), reworked for training via Direct Preference Optimization. The prompt format is for the [Mistral Instruct](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) models. The original dataset is not mine; it is posted here as it may be of use to others.
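A minimal loading sketch for this subset (only the repository id, listed in the entry below, is taken from the source; split and column names are not stated here, so the code inspects them rather than assuming a particular DPO schema):

```python
from datasets import load_dataset

# Repository id as listed in this entry; splits and columns are not documented
# here, so print them instead of hard-coding names such as prompt/chosen/rejected.
ds = load_dataset("Venkat-Ram-Rao/msmarco_subset_for_dpo_llm_ranker")
print(ds)  # available splits and row counts (expected: 210 rows in total)

first_split = next(iter(ds.values()))
print(first_split.column_names)
print(first_split[0])
```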
Venkat-Ram-Rao/msmarco_subset_for_dpo_llm_ranker
[ "region:us" ]
2024-01-21T06:41:32+00:00
{}
2024-01-21T06:45:05+00:00
[]
[]
TAGS #region-us
210 rows from the MS_MARCO Dataset, reworked for training via Direct Preference Optimization. The prompt format is for the Mistral Instruct models. The original dataset is not mine; it is posted here as it may be of use to others.
[]
[ "TAGS\n#region-us \n" ]
6cebee2458c2703b44e093f190e8914aad0056e7
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T06:47:29.951488](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1/blob/main/results_2024-01-21T06-47-29.951488.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.59052978065543, "acc_stderr": 0.03349268505074206, "acc_norm": 0.5952047695238794, "acc_norm_stderr": 0.03418111471832376, "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6325766616332602, "mc2_stderr": 0.015487593519142183 }, "harness|arc:challenge|25": { "acc": 0.5477815699658704, "acc_stderr": 0.014544519880633825, "acc_norm": 0.5955631399317406, "acc_norm_stderr": 0.01434203648343618 }, "harness|hellaswag|10": { "acc": 0.6301533559051982, "acc_stderr": 0.004817763581410245, "acc_norm": 0.8227444732125074, "acc_norm_stderr": 0.0038110434120246627 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6052631578947368, "acc_stderr": 0.039777499346220734, "acc_norm": 0.6052631578947368, "acc_norm_stderr": 0.039777499346220734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.0295822451283843, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.0295822451283843 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5780346820809249, "acc_stderr": 0.0376574669386515, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.024942368931159788, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.024942368931159788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.043758884927270605 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302837, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302837 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.0351760354036101, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.0351760354036101 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.035679697722680495, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.035679697722680495 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.03115626951964683, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.03115626951964683 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5384615384615384, "acc_stderr": 0.025275892070240644, "acc_norm": 0.5384615384615384, "acc_norm_stderr": 0.025275892070240644 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857406, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857406 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5546218487394958, "acc_stderr": 0.0322841062671639, "acc_norm": 
0.5546218487394958, "acc_norm_stderr": 0.0322841062671639 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7834862385321101, "acc_stderr": 0.017658710594443128, "acc_norm": 0.7834862385321101, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.030190282453501947, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.030190282453501947 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6143497757847534, "acc_stderr": 0.03266842214289201, "acc_norm": 0.6143497757847534, "acc_norm_stderr": 0.03266842214289201 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094634, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094634 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7055214723926381, "acc_stderr": 0.03581165790474082, "acc_norm": 0.7055214723926381, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.0465614711001235, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.0465614711001235 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7624521072796935, "acc_stderr": 0.015218733046150191, "acc_norm": 0.7624521072796935, "acc_norm_stderr": 0.015218733046150191 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6560693641618497, "acc_stderr": 0.025574123786546665, "acc_norm": 0.6560693641618497, "acc_norm_stderr": 0.025574123786546665 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33519553072625696, "acc_stderr": 0.015788007190185888, "acc_norm": 0.33519553072625696, "acc_norm_stderr": 0.015788007190185888 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6601307189542484, "acc_stderr": 0.027121956071388852, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.027121956071388852 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6720257234726688, "acc_stderr": 0.02666441088693762, "acc_norm": 0.6720257234726688, "acc_norm_stderr": 0.02666441088693762 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.6820987654320988, "acc_stderr": 0.025910063528240875, "acc_norm": 0.6820987654320988, "acc_norm_stderr": 0.025910063528240875 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.02973659252642444, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.424380704041721, "acc_stderr": 0.01262334375743002, "acc_norm": 0.424380704041721, "acc_norm_stderr": 0.01262334375743002 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6143790849673203, "acc_stderr": 0.019691459052354022, "acc_norm": 0.6143790849673203, "acc_norm_stderr": 0.019691459052354022 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7711442786069652, "acc_stderr": 0.029705284056772432, "acc_norm": 0.7711442786069652, "acc_norm_stderr": 0.029705284056772432 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.0312678171466318, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.0312678171466318 }, "harness|truthfulqa:mc|0": { "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168106, "mc2": 0.6325766616332602, "mc2_stderr": 0.015487593519142183 }, "harness|winogrande|5": { "acc": 0.7703235990528808, "acc_stderr": 0.011821645601838229 }, "harness|gsm8k|5": { "acc": 0.3752843062926459, "acc_stderr": 0.013337170545742934 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
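The snippet in the card above pulls a single task config; as a complementary sketch (using the repository id already shown in that snippet, and the "results" / "latest" split convention that the other entries in this dump define for their aggregated config), the run-level metrics can be loaded the same way:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run; the "latest"
# split points to the most recent of the two evaluation runs for this model.
results = load_dataset(
    "open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1",
    "results",
    split="latest",
)
print(results[0])
```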
open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1
[ "region:us" ]
2024-01-21T06:43:17+00:00
{"pretty_name": "Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T06:47:29.951488](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1/blob/main/results_2024-01-21T06-47-29.951488.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.59052978065543,\n \"acc_stderr\": 0.03349268505074206,\n \"acc_norm\": 0.5952047695238794,\n \"acc_norm_stderr\": 0.03418111471832376,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6325766616332602,\n \"mc2_stderr\": 0.015487593519142183\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633825,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.01434203648343618\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6301533559051982,\n \"acc_stderr\": 0.004817763581410245,\n \"acc_norm\": 0.8227444732125074,\n \"acc_norm_stderr\": 0.0038110434120246627\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n 
\"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n 
\"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 
0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n \"acc_stderr\": 0.015218733046150191,\n \"acc_norm\": 0.7624521072796935,\n \"acc_norm_stderr\": 0.015218733046150191\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.015788007190185888,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.015788007190185888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240875,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240875\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n \"acc_stderr\": 0.01262334375743002,\n \"acc_norm\": 0.424380704041721,\n \"acc_norm_stderr\": 0.01262334375743002\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354022,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.6325766616332602,\n \"mc2_stderr\": 0.015487593519142183\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.3752843062926459,\n \"acc_stderr\": 0.013337170545742934\n }\n}\n```", "repo_url": "https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-41-01.110110.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-41-01.110110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": 
"2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T06-47-29.951488.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["**/details_harness|winogrande|5_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["**/details_harness|winogrande|5_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T06-47-29.951488.parquet"]}]}, 
{"config_name": "results", "data_files": [{"split": "2024_01_21T06_41_01.110110", "path": ["results_2024-01-21T06-41-01.110110.parquet"]}, {"split": "2024_01_21T06_47_29.951488", "path": ["results_2024-01-21T06-47-29.951488.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T06-47-29.951488.parquet"]}]}]}
2024-01-21T06:50:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1 Dataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch below): ## Latest results These are the latest results from run 2024-01-21T06:47:29.951488 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
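A minimal sketch of that loading call, mirroring the pattern used by the other evaluation cards in this collection; the repository id below is an assumption derived from the leaderboard's usual `details_<org>__<model>` naming convention (it is not stated in this record), and `harness_winogrande_5` is just one of the configs listed in the metadata above:

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard naming convention (details_<org>__<model>).
data = load_dataset(
    "open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-lp-e1",
    "harness_winogrande_5",  # any config listed in the metadata works, e.g. "results"
    split="train",           # "train" points to the latest run; timestamped splits are also available
)
```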
[ "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:47:29.951488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-lp-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T06:47:29.951488(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
bd04fc7fb92e9642ec1117535c4fb13c9f27bcdb
# Dataset Card for Evaluation run of 222gate/TinyMistral-248Mx4-MOE <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [222gate/TinyMistral-248Mx4-MOE](https://huggingface.co/222gate/TinyMistral-248Mx4-MOE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T07:05:45.702729](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE/blob/main/results_2024-01-21T07-05-45.702729.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24830542907208702, "acc_stderr": 0.030471240073543585, "acc_norm": 0.24917865866615294, "acc_norm_stderr": 0.03128580366341738, "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.4865533579688347, "mc2_stderr": 0.01667138127210037 }, "harness|arc:challenge|25": { "acc": 0.2235494880546075, "acc_stderr": 0.012174896631202607, "acc_norm": 0.295221843003413, "acc_norm_stderr": 0.013329750293382316 }, "harness|hellaswag|10": { "acc": 0.2561242780322645, "acc_stderr": 0.00435599209003099, "acc_norm": 0.25712009559848636, "acc_norm_stderr": 0.004361529679492746 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.24342105263157895, "acc_stderr": 0.034923496688842384, "acc_norm": 0.24342105263157895, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2830188679245283, "acc_stderr": 0.027724236492700904, "acc_norm": 0.2830188679245283, "acc_norm_stderr": 0.027724236492700904 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03476590104304134, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 
0.04688261722621503 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2774566473988439, "acc_stderr": 0.03414014007044036, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.03414014007044036 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.044405219061793275, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.044405219061793275 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2127659574468085, "acc_stderr": 0.02675439134803976, "acc_norm": 0.2127659574468085, "acc_norm_stderr": 0.02675439134803976 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.0433913832257986, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.0433913832257986 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2689655172413793, "acc_stderr": 0.03695183311650232, "acc_norm": 0.2689655172413793, "acc_norm_stderr": 0.03695183311650232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.022789673145776575, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.022789673145776575 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23809523809523808, "acc_stderr": 0.03809523809523812, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.03809523809523812 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.267741935483871, "acc_stderr": 0.025189006660212378, "acc_norm": 0.267741935483871, "acc_norm_stderr": 0.025189006660212378 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.22660098522167488, "acc_stderr": 0.02945486383529297, "acc_norm": 0.22660098522167488, "acc_norm_stderr": 0.02945486383529297 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.13, "acc_stderr": 0.0337997668989631, "acc_norm": 0.13, "acc_norm_stderr": 0.0337997668989631 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23636363636363636, "acc_stderr": 0.03317505930009179, "acc_norm": 0.23636363636363636, "acc_norm_stderr": 0.03317505930009179 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.20707070707070707, "acc_stderr": 0.028869778460267045, "acc_norm": 0.20707070707070707, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3471502590673575, "acc_stderr": 0.034356961683613546, "acc_norm": 0.3471502590673575, "acc_norm_stderr": 0.034356961683613546 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2743589743589744, "acc_stderr": 0.022622765767493214, "acc_norm": 0.2743589743589744, "acc_norm_stderr": 0.022622765767493214 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.026719240783712173, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.026719240783712173 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2815126050420168, "acc_stderr": 0.029213549414372146, "acc_norm": 0.2815126050420168, "acc_norm_stderr": 0.029213549414372146 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.271523178807947, "acc_stderr": 0.036313298039696545, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696545 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3100917431192661, "acc_stderr": 0.01983084968443975, "acc_norm": 0.3100917431192661, "acc_norm_stderr": 0.01983084968443975 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.20833333333333334, "acc_stderr": 0.027696910713093936, "acc_norm": 0.20833333333333334, "acc_norm_stderr": 0.027696910713093936 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23039215686274508, "acc_stderr": 0.029554292605695046, "acc_norm": 0.23039215686274508, "acc_norm_stderr": 0.029554292605695046 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25738396624472576, "acc_stderr": 0.0284588209914603, "acc_norm": 0.25738396624472576, "acc_norm_stderr": 0.0284588209914603 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.19282511210762332, "acc_stderr": 0.02647824096048936, "acc_norm": 0.19282511210762332, "acc_norm_stderr": 0.02647824096048936 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.039153454088478354, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.039153454088478354 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2066115702479339, "acc_stderr": 0.03695980128098825, "acc_norm": 0.2066115702479339, "acc_norm_stderr": 0.03695980128098825 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2037037037037037, "acc_stderr": 0.03893542518824847, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.24539877300613497, "acc_stderr": 0.03380939813943354, "acc_norm": 0.24539877300613497, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.24107142857142858, "acc_stderr": 0.040598672469526864, "acc_norm": 0.24107142857142858, "acc_norm_stderr": 0.040598672469526864 }, "harness|hendrycksTest-management|5": { "acc": 0.32038834951456313, "acc_stderr": 0.0462028408228004, "acc_norm": 0.32038834951456313, "acc_norm_stderr": 0.0462028408228004 }, "harness|hendrycksTest-marketing|5": { "acc": 0.20512820512820512, "acc_stderr": 0.026453508054040356, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.026453508054040356 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.14, "acc_stderr": 0.0348735088019777, "acc_norm": 0.14, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.25722543352601157, "acc_stderr": 0.02353292543104429, "acc_norm": 0.25722543352601157, "acc_norm_stderr": 0.02353292543104429 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.264804469273743, "acc_stderr": 0.01475690648326066, "acc_norm": 0.264804469273743, "acc_norm_stderr": 0.01475690648326066 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.27450980392156865, "acc_stderr": 0.02555316999182653, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.02555316999182653 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.19935691318327975, "acc_stderr": 0.022691033780549656, "acc_norm": 0.19935691318327975, "acc_norm_stderr": 0.022691033780549656 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2345679012345679, "acc_stderr": 0.02357688174400572, "acc_norm": 0.2345679012345679, "acc_norm_stderr": 
0.02357688174400572 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24113475177304963, "acc_stderr": 0.025518731049537762, "acc_norm": 0.24113475177304963, "acc_norm_stderr": 0.025518731049537762 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2588005215123859, "acc_stderr": 0.01118610904656461, "acc_norm": 0.2588005215123859, "acc_norm_stderr": 0.01118610904656461 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3272058823529412, "acc_stderr": 0.028501452860396563, "acc_norm": 0.3272058823529412, "acc_norm_stderr": 0.028501452860396563 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24673202614379086, "acc_stderr": 0.0174408203674025, "acc_norm": 0.24673202614379086, "acc_norm_stderr": 0.0174408203674025 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2909090909090909, "acc_stderr": 0.04350271442923243, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27755102040816326, "acc_stderr": 0.02866685779027465, "acc_norm": 0.27755102040816326, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.03076944496729602, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.03076944496729602 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.21084337349397592, "acc_stderr": 0.031755547866299194, "acc_norm": 0.21084337349397592, "acc_norm_stderr": 0.031755547866299194 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.27485380116959063, "acc_stderr": 0.03424042924691582, "acc_norm": 0.27485380116959063, "acc_norm_stderr": 0.03424042924691582 }, "harness|truthfulqa:mc|0": { "mc1": 0.24357405140758873, "mc1_stderr": 0.01502635482491078, "mc2": 0.4865533579688347, "mc2_stderr": 0.01667138127210037 }, "harness|winogrande|5": { "acc": 0.5177584846093133, "acc_stderr": 0.014043619596174962 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE
[ "region:us" ]
2024-01-21T07:08:05+00:00
{"pretty_name": "Evaluation run of 222gate/TinyMistral-248Mx4-MOE", "dataset_summary": "Dataset automatically created during the evaluation run of model [222gate/TinyMistral-248Mx4-MOE](https://huggingface.co/222gate/TinyMistral-248Mx4-MOE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T07:05:45.702729](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE/blob/main/results_2024-01-21T07-05-45.702729.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24830542907208702,\n \"acc_stderr\": 0.030471240073543585,\n \"acc_norm\": 0.24917865866615294,\n \"acc_norm_stderr\": 0.03128580366341738,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4865533579688347,\n \"mc2_stderr\": 0.01667138127210037\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202607,\n \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382316\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2561242780322645,\n \"acc_stderr\": 0.00435599209003099,\n \"acc_norm\": 0.25712009559848636,\n \"acc_norm_stderr\": 0.004361529679492746\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.02675439134803976,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.02675439134803976\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776575,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.025189006660212378,\n \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529297,\n \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529297\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.13,\n \"acc_stderr\": 0.0337997668989631,\n \"acc_norm\": 0.13,\n \"acc_norm_stderr\": 0.0337997668989631\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.022622765767493214,\n \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.022622765767493214\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372146,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372146\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3100917431192661,\n \"acc_stderr\": 0.01983084968443975,\n \"acc_norm\": 0.3100917431192661,\n \"acc_norm_stderr\": 0.01983084968443975\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695046,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.0284588209914603,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.0284588209914603\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098825,\n \"acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.040598672469526864,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.040598672469526864\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.0462028408228004,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.0462028408228004\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.026453508054040356,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.026453508054040356\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104429,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104429\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.01475690648326066,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.01475690648326066\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182653,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182653\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.02357688174400572,\n \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.02357688174400572\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n \"acc_stderr\": 0.01118610904656461,\n \"acc_norm\": 0.2588005215123859,\n \"acc_norm_stderr\": 0.01118610904656461\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396563,\n \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396563\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691582,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691582\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4865533579688347,\n \"mc2_stderr\": 0.01667138127210037\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5177584846093133,\n \"acc_stderr\": 0.014043619596174962\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/222gate/TinyMistral-248Mx4-MOE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T07_05_45.702729", "path": ["**/details_harness|winogrande|5_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T07-05-45.702729.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T07_05_45.702729", "path": ["results_2024-01-21T07-05-45.702729.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T07-05-45.702729.parquet"]}]}]}
2024-01-21T07:08:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 222gate/TinyMistral-248Mx4-MOE Dataset automatically created during the evaluation run of model 222gate/TinyMistral-248Mx4-MOE on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T07:05:45.702729 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
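Note: in this stripped plain-text rendering, the code block referenced by "you can for instance do the following:" is missing. For convenience, here is a minimal sketch assembled from the loading call and the config/split names given in this record's metadata; it assumes the `datasets` library is installed and the repository is publicly readable.

```python
from datasets import load_dataset

# Per-task details, as given in this record's dataset_summary
data = load_dataset(
    "open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics: the "results" config also exposes a "latest" split
# (config and split names taken from the metadata field of this record)
results = load_dataset(
    "open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE",
    "results",
    split="latest",
)
```

Any of the per-task config names listed in the metadata (for example `harness_gsm8k_5`) can be substituted in the same call.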
[ "# Dataset Card for Evaluation run of 222gate/TinyMistral-248Mx4-MOE\n\n\n\nDataset automatically created during the evaluation run of model 222gate/TinyMistral-248Mx4-MOE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:05:45.702729(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 222gate/TinyMistral-248Mx4-MOE\n\n\n\nDataset automatically created during the evaluation run of model 222gate/TinyMistral-248Mx4-MOE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:05:45.702729(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0b65b2b9e30916277c726f5dc1398732a07280ee
# Dataset Card for "NER_processed_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
SKT27182/NER_processed_data
[ "region:us" ]
2024-01-21T07:22:13+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "tags", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "dataset_num", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 6967086.513065097, "num_examples": 15766}, {"name": "test", "num_bytes": 1742434.4869349028, "num_examples": 3943}], "download_size": 2820200, "dataset_size": 8709521.0}}
2024-01-21T07:22:36+00:00
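The dataset_info metadata above describes the record schema (string id/tags/text fields, an integer dataset_num, a tokens sequence, and a ner_tags sequence of floats) with train (15,766 rows) and test (3,943 rows) splits. As a hedged illustration only, and assuming the SKT27182/NER_processed_data repository is publicly loadable and that ner_tags aligns one-to-one with tokens, a minimal sketch of inspecting one example:

```python
from datasets import load_dataset

# Assumption: the repo id from this record is publicly readable on the Hub
ds = load_dataset("SKT27182/NER_processed_data")
print(ds)  # expected: DatasetDict with 'train' (~15,766 rows) and 'test' (~3,943 rows)

example = ds["train"][0]
# Assumption: ner_tags is a float-encoded label aligned 1:1 with tokens
for token, tag in zip(example["tokens"], example["ner_tags"]):
    print(f"{token}\t{tag}")
```

The float dtype of `ner_tags` suggests the labels may need casting to int before mapping them to tag names; treat that as an assumption to verify against the source data.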
[]
[]
TAGS #region-us
# Dataset Card for "NER_processed_data" More Information needed
[ "# Dataset Card for \"NER_processed_data\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"NER_processed_data\"\n\nMore Information needed" ]
4632c6df067de5f79db02b3c46477fb89c90ebff
# Dataset Card for Evaluation run of PetroGPT/Severus-7B-DPO <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [PetroGPT/Severus-7B-DPO](https://huggingface.co/PetroGPT/Severus-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T07:34:31.243041](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO/blob/main/results_2024-01-21T07-34-31.243041.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6537254975299045, "acc_stderr": 0.03206253983450032, "acc_norm": 0.6539950421188907, "acc_norm_stderr": 0.03271890557687562, "mc1": 0.48592411260709917, "mc1_stderr": 0.017496563717042793, "mc2": 0.6441479372366624, "mc2_stderr": 0.015403047466723335 }, "harness|arc:challenge|25": { "acc": 0.6723549488054608, "acc_stderr": 0.013715847940719339, "acc_norm": 0.7022184300341296, "acc_norm_stderr": 0.01336308010724448 }, "harness|hellaswag|10": { "acc": 0.6904999004182434, "acc_stderr": 0.0046134277452095146, "acc_norm": 0.8709420434176459, "acc_norm_stderr": 0.003345788905262948 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047423976, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047423976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768077, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.0255428468174005, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.0255428468174005 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.021995311963644237, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.021995311963644237 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.029953823891887037, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887037 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 
0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590167, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590167 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.822477650063857, "acc_stderr": 0.013664230995834841, "acc_norm": 0.822477650063857, "acc_norm_stderr": 0.013664230995834841 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4011173184357542, "acc_stderr": 0.01639222189940708, "acc_norm": 0.4011173184357542, "acc_norm_stderr": 0.01639222189940708 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712992, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712992 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653349, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653349 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031215, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031215 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507205, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507205 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.028782108105401712, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.028782108105401712 }, "harness|truthfulqa:mc|0": { "mc1": 0.48592411260709917, "mc1_stderr": 0.017496563717042793, "mc2": 0.6441479372366624, "mc2_stderr": 0.015403047466723335 }, "harness|winogrande|5": { "acc": 0.8066298342541437, "acc_stderr": 0.011099796645920524 }, "harness|gsm8k|5": { "acc": 0.6952236542835482, "acc_stderr": 0.012679297549515437 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
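For the aggregated metrics specifically, a minimal loading sketch (the configuration names `results` and `harness_gsm8k_5` and the `latest` split are taken from this card's config listing above):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO"

# Aggregated metrics for the run; the "latest" split always points to the most recent results.
results = load_dataset(repo, "results", split="latest")

# Per-task details are loaded the same way, e.g. the GSM8K generations and scores.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Each row is a plain dict of the logged fields.
print(results[0].keys())
```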
open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO
[ "region:us" ]
2024-01-21T07:36:46+00:00
{"pretty_name": "Evaluation run of PetroGPT/Severus-7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [PetroGPT/Severus-7B-DPO](https://huggingface.co/PetroGPT/Severus-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T07:34:31.243041](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__Severus-7B-DPO/blob/main/results_2024-01-21T07-34-31.243041.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537254975299045,\n \"acc_stderr\": 0.03206253983450032,\n \"acc_norm\": 0.6539950421188907,\n \"acc_norm_stderr\": 0.03271890557687562,\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6441479372366624,\n \"mc2_stderr\": 0.015403047466723335\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.013715847940719339,\n \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.01336308010724448\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6904999004182434,\n \"acc_stderr\": 0.0046134277452095146,\n \"acc_norm\": 0.8709420434176459,\n \"acc_norm_stderr\": 0.003345788905262948\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 
0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401712,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401712\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6441479372366624,\n \"mc2_stderr\": 0.015403047466723335\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.012679297549515437\n }\n}\n```", "repo_url": 
"https://huggingface.co/PetroGPT/Severus-7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-34-31.243041.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-34-31.243041.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-34-31.243041.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-34-31.243041.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-34-31.243041.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T07_34_31.243041", "path": ["**/details_harness|winogrande|5_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T07-34-31.243041.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T07_34_31.243041", "path": ["results_2024-01-21T07-34-31.243041.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T07-34-31.243041.parquet"]}]}]}
2024-01-21T07:37:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PetroGPT/Severus-7B-DPO Dataset automatically created during the evaluation run of model PetroGPT/Severus-7B-DPO on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T07:34:31.243041 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of PetroGPT/Severus-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model PetroGPT/Severus-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:34:31.243041(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PetroGPT/Severus-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model PetroGPT/Severus-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:34:31.243041(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
981fe356969ec6815bc52be17c0d1c74004782df
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.8 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.8](https://huggingface.co/andysalerno/openchat-nectar-0.8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T07:37:25.188045](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8/blob/main/results_2024-01-21T07-37-25.188045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6539948201078623, "acc_stderr": 0.03186024696025248, "acc_norm": 0.6547535995818915, "acc_norm_stderr": 0.03251445540703094, "mc1": 0.3574051407588739, "mc1_stderr": 0.01677659967672941, "mc2": 0.5226230452646764, "mc2_stderr": 0.015325117203952783 }, "harness|arc:challenge|25": { "acc": 0.6237201365187713, "acc_stderr": 0.014157022555407158, "acc_norm": 0.6578498293515358, "acc_norm_stderr": 0.013864152159177275 }, "harness|hellaswag|10": { "acc": 0.6347341167098187, "acc_stderr": 0.00480520579872457, "acc_norm": 0.8305118502290381, "acc_norm_stderr": 0.0037441574425365596 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.02794321998933714, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.02794321998933714 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 
0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6936416184971098, "acc_stderr": 0.03514942551267438, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.03514942551267438 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5079365079365079, "acc_stderr": 0.044715725362943486, "acc_norm": 0.5079365079365079, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.02302589961718872, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.02302589961718872 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479048, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479048 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033477, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033477 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.02938162072646507, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.02938162072646507 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6974789915966386, "acc_stderr": 0.02983796238829194, "acc_norm": 0.6974789915966386, "acc_norm_stderr": 0.02983796238829194 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, 
"acc_stderr": 0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8495412844036697, "acc_stderr": 0.015328563932669237, "acc_norm": 0.8495412844036697, "acc_norm_stderr": 0.015328563932669237 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944867, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944867 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.03050028317654585, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.03050028317654585 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8365261813537676, "acc_stderr": 0.013223928616741624, "acc_norm": 0.8365261813537676, "acc_norm_stderr": 0.013223928616741624 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7630057803468208, "acc_stderr": 0.02289408248992599, "acc_norm": 0.7630057803468208, "acc_norm_stderr": 0.02289408248992599 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25027932960893856, "acc_stderr": 0.01448750085285042, "acc_norm": 0.25027932960893856, "acc_norm_stderr": 0.01448750085285042 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7623456790123457, "acc_stderr": 0.023683591837008557, "acc_norm": 0.7623456790123457, "acc_norm_stderr": 0.023683591837008557 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4921773142112125, "acc_stderr": 0.012768673076111898, "acc_norm": 0.4921773142112125, "acc_norm_stderr": 0.012768673076111898 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7279411764705882, "acc_stderr": 0.02703304115168146, "acc_norm": 0.7279411764705882, "acc_norm_stderr": 0.02703304115168146 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069446, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069446 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174937, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174937 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578334, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578334 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197768, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197768 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072767, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072767 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.01677659967672941, "mc2": 0.5226230452646764, "mc2_stderr": 0.015325117203952783 }, "harness|winogrande|5": { "acc": 0.8161010260457774, "acc_stderr": 0.01088791601330589 }, "harness|gsm8k|5": { "acc": 0.6770280515542078, "acc_stderr": 0.012880360794851806 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
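Beyond the per-task details loaded in the snippet earlier in this card, the card states that the aggregated scores live in the "results" configuration and that a "latest" split always points to the most recent run. A minimal sketch of reading those aggregates, assuming the standard `datasets` API and the split names declared in the configuration metadata below, is:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8",
	"results",
	split="latest")
print(results[0])
```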
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8
[ "region:us" ]
2024-01-21T07:39:42+00:00
{"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.8", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.8](https://huggingface.co/andysalerno/openchat-nectar-0.8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T07:37:25.188045](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8/blob/main/results_2024-01-21T07-37-25.188045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539948201078623,\n \"acc_stderr\": 0.03186024696025248,\n \"acc_norm\": 0.6547535995818915,\n \"acc_norm_stderr\": 0.03251445540703094,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5226230452646764,\n \"mc2_stderr\": 0.015325117203952783\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407158,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6347341167098187,\n \"acc_stderr\": 0.00480520579872457,\n \"acc_norm\": 0.8305118502290381,\n \"acc_norm_stderr\": 0.0037441574425365596\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944867,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944867\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n 
\"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.01448750085285042,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.01448750085285042\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.012768673076111898,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.012768673076111898\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5226230452646764,\n \"mc2_stderr\": 0.015325117203952783\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.01088791601330589\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \"acc_stderr\": 0.012880360794851806\n }\n}\n```", 
"repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.8", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T07_37_25.188045", "path": ["**/details_harness|winogrande|5_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T07-37-25.188045.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T07_37_25.188045", "path": ["results_2024-01-21T07-37-25.188045.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T07-37-25.188045.parquet"]}]}]}
2024-01-21T07:40:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.8 Dataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.8 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T07:37:25.188045 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
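The flattened card text above mentions loading the details from a run but omits the accompanying snippet; a minimal sketch is given below, assuming the usual `open-llm-leaderboard/details_<org>__<model>` repository naming and the `harness_winogrande_5` configuration used by the other evaluation cards in this dump.

```python
from datasets import load_dataset

# Minimal sketch: load the per-example details for one task configuration.
# The repository id and configuration name are assumptions based on the
# naming convention shown in the other evaluation cards of this dump.
data = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.8",
    "harness_winogrande_5",
    split="train",
)
```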
[ "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.8\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:37:25.188045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.8\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:37:25.188045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
8b03f074466e085b51a0d7b52127b53fb510b2ae
# Dataset Card for Evaluation run of Kquant03/BurningBruce-SOLAR-8x10.7B-bf16 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Kquant03/BurningBruce-SOLAR-8x10.7B-bf16](https://huggingface.co/Kquant03/BurningBruce-SOLAR-8x10.7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T07:45:02.704835](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16/blob/main/results_2024-01-21T07-45-02.704835.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6657906302690986, "acc_stderr": 0.03148810669105293, "acc_norm": 0.6668796900256831, "acc_norm_stderr": 0.0321268289383682, "mc1": 0.5507955936352509, "mc1_stderr": 0.017412941986115295, "mc2": 0.686693861909958, "mc2_stderr": 0.01517360670121969 }, "harness|arc:challenge|25": { "acc": 0.6655290102389079, "acc_stderr": 0.013787460322441374, "acc_norm": 0.6911262798634812, "acc_norm_stderr": 0.013501770929344003 }, "harness|hellaswag|10": { "acc": 0.696176060545708, "acc_stderr": 0.004589676274079085, "acc_norm": 0.8781119298944433, "acc_norm_stderr": 0.003264878737586885 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.75, "acc_stderr": 0.03523807393012047, "acc_norm": 0.75, "acc_norm_stderr": 0.03523807393012047 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.73, "acc_stderr": 0.04461960433384741, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.625531914893617, "acc_stderr": 0.03163910665367291, "acc_norm": 0.625531914893617, "acc_norm_stderr": 0.03163910665367291 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6344827586206897, "acc_stderr": 0.04013124195424386, "acc_norm": 0.6344827586206897, "acc_norm_stderr": 0.04013124195424386 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47354497354497355, "acc_stderr": 0.025715239811346758, "acc_norm": 0.47354497354497355, "acc_norm_stderr": 0.025715239811346758 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8193548387096774, "acc_stderr": 0.021886178567172534, "acc_norm": 0.8193548387096774, "acc_norm_stderr": 0.021886178567172534 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8121212121212121, "acc_stderr": 0.03050193405942914, "acc_norm": 0.8121212121212121, "acc_norm_stderr": 0.03050193405942914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8787878787878788, "acc_stderr": 0.02325315795194208, "acc_norm": 0.8787878787878788, "acc_norm_stderr": 0.02325315795194208 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7184873949579832, "acc_stderr": 0.02921354941437217, "acc_norm": 0.7184873949579832, "acc_norm_stderr": 0.02921354941437217 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 
0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.01540508439315707, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.01540508439315707 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5648148148148148, "acc_stderr": 0.03381200005643527, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.03381200005643527 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.0251956584289318, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.0251956584289318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8481012658227848, "acc_stderr": 0.023363878096632446, "acc_norm": 0.8481012658227848, "acc_norm_stderr": 0.023363878096632446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572213, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572213 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.035208939510976534, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.035208939510976534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973143, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973143 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38100558659217876, "acc_stderr": 0.016242028834053613, "acc_norm": 0.38100558659217876, "acc_norm_stderr": 0.016242028834053613 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7679738562091504, "acc_stderr": 0.024170840879340863, "acc_norm": 0.7679738562091504, "acc_norm_stderr": 0.024170840879340863 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.729903536977492, "acc_stderr": 0.02521804037341062, "acc_norm": 0.729903536977492, "acc_norm_stderr": 0.02521804037341062 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7901234567901234, "acc_stderr": 0.02265834408598137, "acc_norm": 0.7901234567901234, "acc_norm_stderr": 0.02265834408598137 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.5, "acc_stderr": 0.029827499313594685, "acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48239895697522817, "acc_stderr": 0.012762321298823645, "acc_norm": 0.48239895697522817, "acc_norm_stderr": 0.012762321298823645 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7536764705882353, "acc_stderr": 0.02617343857052, "acc_norm": 0.7536764705882353, "acc_norm_stderr": 0.02617343857052 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6993464052287581, "acc_stderr": 0.01855063450295296, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.01855063450295296 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.027833023871399683, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.027833023871399683 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5783132530120482, "acc_stderr": 0.038444531817709175, "acc_norm": 0.5783132530120482, "acc_norm_stderr": 0.038444531817709175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03126781714663179, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03126781714663179 }, "harness|truthfulqa:mc|0": { "mc1": 0.5507955936352509, "mc1_stderr": 0.017412941986115295, "mc2": 0.686693861909958, "mc2_stderr": 0.01517360670121969 }, "harness|winogrande|5": { "acc": 0.8334648776637726, "acc_stderr": 0.010470796496781084 }, "harness|gsm8k|5": { "acc": 0.6413949962092494, "acc_stderr": 0.013210317364134035 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
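The card explains that the aggregated metrics are stored in the extra "results" configuration and that the "latest" split always points at the most recent run. A minimal sketch of reading those aggregates with the `datasets` library, assuming the configuration and split names listed in this card's metadata, could look like this:

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run: the "results" configuration,
# with the "latest" split pointing at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated scores
```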
open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16
[ "region:us" ]
2024-01-21T07:47:20+00:00
{"pretty_name": "Evaluation run of Kquant03/BurningBruce-SOLAR-8x10.7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/BurningBruce-SOLAR-8x10.7B-bf16](https://huggingface.co/Kquant03/BurningBruce-SOLAR-8x10.7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T07:45:02.704835](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16/blob/main/results_2024-01-21T07-45-02.704835.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6657906302690986,\n \"acc_stderr\": 0.03148810669105293,\n \"acc_norm\": 0.6668796900256831,\n \"acc_norm_stderr\": 0.0321268289383682,\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.017412941986115295,\n \"mc2\": 0.686693861909958,\n \"mc2_stderr\": 0.01517360670121969\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344003\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.696176060545708,\n \"acc_stderr\": 0.004589676274079085,\n \"acc_norm\": 0.8781119298944433,\n \"acc_norm_stderr\": 0.003264878737586885\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.01540508439315707,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.01540508439315707\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973143,\n \"acc_norm\": 
0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973143\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n \"acc_stderr\": 0.016242028834053613,\n \"acc_norm\": 0.38100558659217876,\n \"acc_norm_stderr\": 0.016242028834053613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340863,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340863\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.02265834408598137,\n \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.02265834408598137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48239895697522817,\n \"acc_stderr\": 0.012762321298823645,\n \"acc_norm\": 0.48239895697522817,\n \"acc_norm_stderr\": 0.012762321298823645\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.01855063450295296,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.01855063450295296\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.017412941986115295,\n \"mc2\": 0.686693861909958,\n \"mc2_stderr\": 0.01517360670121969\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781084\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6413949962092494,\n \"acc_stderr\": 0.013210317364134035\n }\n}\n```", "repo_url": "https://huggingface.co/Kquant03/BurningBruce-SOLAR-8x10.7B-bf16", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-45-02.704835.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-45-02.704835.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-45-02.704835.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T07-45-02.704835.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-45-02.704835.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-45-02.704835.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["**/details_harness|winogrande|5_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T07-45-02.704835.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T07_45_02.704835", "path": ["results_2024-01-21T07-45-02.704835.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T07-45-02.704835.parquet"]}]}]}
2024-01-21T07:47:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Kquant03/BurningBruce-SOLAR-8x10.7B-bf16 Dataset automatically created during the evaluation run of model Kquant03/BurningBruce-SOLAR-8x10.7B-bf16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T07:45:02.704835 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
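The loading call referenced just above can be sketched as follows. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming; the `harness_winogrande_5` config and the `latest` split are taken from the config metadata listed earlier in this record:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> convention.
REPO = "open-llm-leaderboard/details_Kquant03__BurningBruce-SOLAR-8x10.7B-bf16"

# Each per-task config exposes a timestamped split plus a "latest" split
# pointing at the most recent evaluation (see the config metadata above).
data = load_dataset(REPO, "harness_winogrande_5", split="latest")
print(data)
```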
[ "# Dataset Card for Evaluation run of Kquant03/BurningBruce-SOLAR-8x10.7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/BurningBruce-SOLAR-8x10.7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:45:02.704835(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Kquant03/BurningBruce-SOLAR-8x10.7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/BurningBruce-SOLAR-8x10.7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T07:45:02.704835(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
73ba1cb72d283ce2916224e6f5f74968229858ce
# Dataset Card for Evaluation run of flemmingmiguel/MDBX-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [flemmingmiguel/MDBX-7B](https://huggingface.co/flemmingmiguel/MDBX-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MDBX-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T08:08:27.552111](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MDBX-7B/blob/main/results_2024-01-21T08-08-27.552111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.655806438283324, "acc_stderr": 0.03200415575634736, "acc_norm": 0.6548887828373608, "acc_norm_stderr": 0.032676368096110006, "mc1": 0.5446756425948592, "mc1_stderr": 0.017433490102538758, "mc2": 0.6818712158396469, "mc2_stderr": 0.015135432675602247 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068744, "acc_norm": 0.7201365187713311, "acc_norm_stderr": 0.013119040897725922 }, "harness|hellaswag|10": { "acc": 0.7108145787691695, "acc_stderr": 0.004524575892952949, "acc_norm": 0.8830910177255527, "acc_norm_stderr": 0.0032065512832573956 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370332, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370332 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.02302589961718872, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.02302589961718872 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.0251956584289318, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.0251956584289318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621115, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621115 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069356, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069356 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4335195530726257, "acc_stderr": 0.016574027219517635, "acc_norm": 0.4335195530726257, "acc_norm_stderr": 0.016574027219517635 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959614, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959614 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869649, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869649 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396556, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396556 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5446756425948592, "mc1_stderr": 0.017433490102538758, "mc2": 0.6818712158396469, "mc2_stderr": 0.015135432675602247 }, "harness|winogrande|5": { "acc": 0.835043409629045, "acc_stderr": 0.010430917468237422 }, "harness|gsm8k|5": { "acc": 0.7217589082638363, "acc_stderr": 0.012343803671422678 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
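Since the card above also mentions the aggregated "results" configuration, a minimal sketch for reading those aggregates is shown below; the `latest` split name is an assumption, following the same convention as the per-task configs:

```python
from datasets import load_dataset

# The "results" config stores the aggregated scores for the run;
# "latest" is assumed to point at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__MDBX-7B",
    "results",
    split="latest",
)
print(results[0])  # the aggregated metrics row for this run
```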
open-llm-leaderboard/details_flemmingmiguel__MDBX-7B
[ "region:us" ]
2024-01-21T08:10:45+00:00
{"pretty_name": "Evaluation run of flemmingmiguel/MDBX-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/MDBX-7B](https://huggingface.co/flemmingmiguel/MDBX-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MDBX-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T08:08:27.552111](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MDBX-7B/blob/main/results_2024-01-21T08-08-27.552111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655806438283324,\n \"acc_stderr\": 0.03200415575634736,\n \"acc_norm\": 0.6548887828373608,\n \"acc_norm_stderr\": 0.032676368096110006,\n \"mc1\": 0.5446756425948592,\n \"mc1_stderr\": 0.017433490102538758,\n \"mc2\": 0.6818712158396469,\n \"mc2_stderr\": 0.015135432675602247\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7108145787691695,\n \"acc_stderr\": 0.004524575892952949,\n \"acc_norm\": 0.8830910177255527,\n \"acc_norm_stderr\": 0.0032065512832573956\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5446756425948592,\n \"mc1_stderr\": 0.017433490102538758,\n \"mc2\": 0.6818712158396469,\n \"mc2_stderr\": 0.015135432675602247\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237422\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \"acc_stderr\": 0.012343803671422678\n }\n}\n```", "repo_url": 
"https://huggingface.co/flemmingmiguel/MDBX-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|arc:challenge|25_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|gsm8k|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hellaswag|10_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T08_08_27.552111", "path": ["**/details_harness|winogrande|5_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T08-08-27.552111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T08_08_27.552111", "path": ["results_2024-01-21T08-08-27.552111.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T08-08-27.552111.parquet"]}]}]}
2024-01-21T08:11:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of flemmingmiguel/MDBX-7B Dataset automatically created during the evaluation run of model flemmingmiguel/MDBX-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a minimal sketch is given just after this card text): ## Latest results These are the latest results from run 2024-01-21T08:08:27.552111 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
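The flattened card above drops the loading snippet that normally follows "To load the details from a run, you can for instance do the following:". A minimal sketch is given here, assuming the standard `datasets` API and the leaderboard's `details_<org>__<model>` repo naming for flemmingmiguel/MDBX-7B (the exact repo id is therefore an assumption).

```python
# Minimal sketch of loading one configuration from this run's details repo.
# The repo id is assumed from the leaderboard's naming convention; the config
# name "harness_winogrande_5" and the "train" split follow the card text.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__MDBX-7B",  # assumed repo id
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
print(data)
```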
[ "# Dataset Card for Evaluation run of flemmingmiguel/MDBX-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MDBX-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T08:08:27.552111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of flemmingmiguel/MDBX-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MDBX-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T08:08:27.552111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4550e8cf6a6679a01e9aabc63f2dc7aff5a4fb4d
# Dataset Card for Evaluation run of chargoddard/internlm2-base-20b-llama <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-20b-llama](https://huggingface.co/chargoddard/internlm2-base-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T08:12:11.575065](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama/blob/main/results_2024-01-21T08-12-11.575065.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6376672606100185, "acc_stderr": 0.03233501598179968, "acc_norm": 0.6426334526719998, "acc_norm_stderr": 0.0329801483756123, "mc1": 0.2913096695226438, "mc1_stderr": 0.01590598704818483, "mc2": 0.43966281100559496, "mc2_stderr": 0.014256122898440773 }, "harness|arc:challenge|25": { "acc": 0.5878839590443686, "acc_stderr": 0.014383915302225403, "acc_norm": 0.6305460750853242, "acc_norm_stderr": 0.014104578366491888 }, "harness|hellaswag|10": { "acc": 0.6158135829516033, "acc_stderr": 0.004854082479916909, "acc_norm": 0.8210515833499303, "acc_norm_stderr": 0.0038252574352092344 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.037150621549989056, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.037150621549989056 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 
0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6358381502890174, "acc_stderr": 0.03669072477416907, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.03669072477416907 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006717, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006717 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036843, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036843 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.031967586978353627, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.031967586978353627 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305528, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305528 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.541871921182266, "acc_stderr": 0.03505630140785741, "acc_norm": 0.541871921182266, "acc_norm_stderr": 0.03505630140785741 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721164, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721164 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026552207828215282, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026552207828215282 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015184, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015184 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6282051282051282, "acc_stderr": 0.024503472557110943, "acc_norm": 0.6282051282051282, "acc_norm_stderr": 0.024503472557110943 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977924, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977924 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, 
"acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092437, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092437 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.02786594228663933, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.02786594228663933 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.695067264573991, "acc_stderr": 0.030898610882477515, "acc_norm": 0.695067264573991, "acc_norm_stderr": 0.030898610882477515 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.04039314978724561, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.04039314978724561 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163025, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128136, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128136 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7956577266922095, "acc_stderr": 0.0144191239809319, "acc_norm": 0.7956577266922095, "acc_norm_stderr": 0.0144191239809319 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.02488314057007176, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.02488314057007176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38994413407821227, "acc_stderr": 0.016312376629213067, "acc_norm": 0.38994413407821227, "acc_norm_stderr": 0.016312376629213067 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188947, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188947 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.024288533637726095, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.024288533637726095 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.029427994039419998, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.029427994039419998 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4765319426336376, "acc_stderr": 0.012756161942523365, "acc_norm": 0.4765319426336376, "acc_norm_stderr": 0.012756161942523365 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389844, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389844 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291313, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291313 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.02553843336857833, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.02553843336857833 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.2913096695226438, "mc1_stderr": 0.01590598704818483, "mc2": 0.43966281100559496, "mc2_stderr": 0.014256122898440773 }, "harness|winogrande|5": { "acc": 0.7821625887924231, "acc_stderr": 0.011601066079939324 }, "harness|gsm8k|5": { "acc": 0.44806671721000757, "acc_stderr": 0.013697992668274518 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
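Since the card explains that the extra "results" configuration stores the aggregated metrics of the run, a short sketch of reading it is added below. The repo id comes from the card's own example; the "latest" split name follows the split pattern shown in the configuration listings elsewhere in this dump, so treat it as an assumption for this particular run.

```python
# Sketch: load the aggregated metrics stored in the "results" configuration.
# The "latest" split name is assumed from the split pattern used by these repos.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent evaluation run
```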
open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama
[ "region:us" ]
2024-01-21T08:14:17+00:00
{"pretty_name": "Evaluation run of chargoddard/internlm2-base-20b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-20b-llama](https://huggingface.co/chargoddard/internlm2-base-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T08:12:11.575065](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama/blob/main/results_2024-01-21T08-12-11.575065.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6376672606100185,\n \"acc_stderr\": 0.03233501598179968,\n \"acc_norm\": 0.6426334526719998,\n \"acc_norm_stderr\": 0.0329801483756123,\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.43966281100559496,\n \"mc2_stderr\": 0.014256122898440773\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6158135829516033,\n \"acc_stderr\": 0.004854082479916909,\n \"acc_norm\": 0.8210515833499303,\n \"acc_norm_stderr\": 0.0038252574352092344\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305528\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110943,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110943\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977924,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977924\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7956577266922095,\n \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.7956577266922095,\n \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n \"acc_stderr\": 0.016312376629213067,\n \"acc_norm\": 0.38994413407821227,\n \"acc_norm_stderr\": 0.016312376629213067\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419998,\n \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419998\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n \"acc_stderr\": 0.012756161942523365,\n \"acc_norm\": 0.4765319426336376,\n \"acc_norm_stderr\": 0.012756161942523365\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291313,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291313\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.43966281100559496,\n \"mc2_stderr\": 0.014256122898440773\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44806671721000757,\n \"acc_stderr\": 0.013697992668274518\n 
}\n}\n```", "repo_url": "https://huggingface.co/chargoddard/internlm2-base-20b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|arc:challenge|25_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|gsm8k|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hellaswag|10_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T08_12_11.575065", "path": ["**/details_harness|winogrande|5_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T08-12-11.575065.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T08_12_11.575065", "path": ["results_2024-01-21T08-12-11.575065.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T08-12-11.575065.parquet"]}]}]}
2024-01-21T08:14:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of chargoddard/internlm2-base-20b-llama Dataset automatically created during the evaluation run of model chargoddard/internlm2-base-20b-llama on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading snippet right after this card text): ## Latest results These are the latest results from run 2024-01-21T08:12:11.575065 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
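For reference, a minimal loading sketch for this details dataset: the repo id and the `harness_winogrande_5` config come from the dataset summary above, the `results` config and its `latest` split from the config list in the metadata, and any of the 63 configs can be substituted for the one shown.

```python
from datasets import load_dataset

# Per-task details for one of the 63 configs (here: Winogrande, 5-shot);
# the "train" split always points at the latest run.
details = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics for the whole run, via the "results" config;
# the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-base-20b-llama",
    "results",
    split="latest",
)
```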
[ "# Dataset Card for Evaluation run of chargoddard/internlm2-base-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-base-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T08:12:11.575065(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of chargoddard/internlm2-base-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-base-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T08:12:11.575065(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
bdb7e8032c42fb47ed7a14df31a02785ab363072
# Dataset Card for "fortuna_alpaca_format" This is a blatant rip-off of the Samantha dataset simply because I didn't feel like talking to Samantha. All credit, and I mean all, goes to Eric Hartford and [cognitivecomputations](https://huggingface.co/cognitivecomputations/samantha-data). Also, this is reformatted into Alpaca format and includes a text column in ChatML format for ease of use.
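As a quick way to inspect the Alpaca-style columns and the ChatML `text` field, a minimal sketch; the repo id, split name, and column names are taken from this dataset's metadata, and nothing else is assumed.

```python
from datasets import load_dataset

# Single "train" split with columns: input, output, text, instruction (per the metadata)
ds = load_dataset("jtatman/fortuna_instruction_format", split="train")

print(ds.column_names)   # ['input', 'output', 'text', 'instruction']
print(ds[0]["text"])     # the same example rendered in ChatML, ready for SFT pipelines
```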
jtatman/fortuna_instruction_format
[ "task_categories:question-answering", "task_categories:conversational", "size_categories:10K<n<100K", "language:en", "license:mit", "dolphin", "samantha", "reformatted", "region:us" ]
2024-01-21T08:16:56+00:00
{"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "conversational"], "pretty_name": "fortuna", "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 51905283, "num_examples": 34687}], "download_size": 20264229, "dataset_size": 51905283}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["dolphin", "samantha", "reformatted"]}
2024-01-23T20:01:39+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #license-mit #dolphin #samantha #reformatted #region-us
# Dataset Card for "fortuna_alpaca_format" This is a blatant rip-off of the Samantha dataset simply because I didn't feel like talking to Samantha. All credit, and I mean all, goes to Eric Hartford and cognitivecomputations. Also, this is reformatted into Alpaca format and includes a text column in ChatML format for ease of use.
[ "# Dataset Card for \"fortuna_alpaca_format\"\n\nThis is a blatant rip-off of the Samantha dataset simply because I didn't feel like talking to Samantha.\n\nAll credit, and I mean all, goes to Eric Hartford and cognitivecomputations\n\nAlso, this is reformatted into Alpaca format and includes a text column in ChatML format for ease of use." ]
[ "TAGS\n#task_categories-question-answering #task_categories-conversational #size_categories-10K<n<100K #language-English #license-mit #dolphin #samantha #reformatted #region-us \n", "# Dataset Card for \"fortuna_alpaca_format\"\n\nThis is a blatant rip-off of the Samantha dataset simply because I didn't feel like talking to Samantha.\n\nAll credit, and I mean all, goes to Eric Hartford and cognitivecomputations\n\nAlso, this is reformatted into Alpaca format and includes a text column in ChatML format for ease of use." ]
8fdbd0ea7ede5a6e17ecdac320084706abfe8adc
# Dataset Card for "open-instruct-uncensored-alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
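Pending more information in the card itself, a minimal loading sketch may help; the repo id, split, and column names below are taken from the dataset metadata, and everything else is an assumption.

```python
from datasets import load_dataset

# ~1.25M rows in a single "train" split with columns: user, assistant, text (per the metadata)
ds = load_dataset("jtatman/open-instruct-uncensored-alpaca", split="train")

print(ds.num_rows)    # 1255224 according to the dataset metadata
print(ds[0]["text"])  # presumably the Alpaca-formatted prompt/response pair
```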
jtatman/open-instruct-uncensored-alpaca
[ "region:us" ]
2024-01-21T08:27:10+00:00
{"dataset_info": {"features": [{"name": "user", "dtype": "string"}, {"name": "assistant", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2040635964, "num_examples": 1255224}], "download_size": 922350127, "dataset_size": 2040635964}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T08:34:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "open-instruct-uncensored-alpaca" More Information needed
[ "# Dataset Card for \"open-instruct-uncensored-alpaca\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"open-instruct-uncensored-alpaca\"\n\nMore Information needed" ]
5971dbfd1d58c952fe80bf01c944d1a79d702950
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a loading sketch for this configuration appears at the end of this card. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T09:04:23.323517](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e1/blob/main/results_2024-01-21T09-04-23.323517.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6066822120870259, "acc_stderr": 0.03318943094704429, "acc_norm": 0.6116640035982913, "acc_norm_stderr": 0.03386329646645478, "mc1": 0.4847001223990208, "mc1_stderr": 0.017495304473187902, "mc2": 0.6471570787741905, "mc2_stderr": 0.015262709036509099 }, "harness|arc:challenge|25": { "acc": 0.5622866894197952, "acc_stderr": 0.01449757388110828, "acc_norm": 0.60580204778157, "acc_norm_stderr": 0.01428052266746732 }, "harness|hellaswag|10": { "acc": 0.637024497112129, "acc_stderr": 0.004798751281560844, "acc_norm": 0.8332005576578371, "acc_norm_stderr": 0.0037203482062126898 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.029067220146644826, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.029067220146644826 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6736111111111112, "acc_stderr": 0.03921067198982266, "acc_norm": 0.6736111111111112, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404947, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.04657047260594963, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.04657047260594963 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067877, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067877 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377563, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377563 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6870967741935484, "acc_stderr": 0.026377567028645858, "acc_norm": 0.6870967741935484, "acc_norm_stderr": 0.026377567028645858 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175008, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939098, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.02985751567338642, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.02985751567338642 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.02541634309630643, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.02541634309630643 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.558974358974359, "acc_stderr": 0.025174048384000745, "acc_norm": 0.558974358974359, "acc_norm_stderr": 0.025174048384000745 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652458, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652458 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 
0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8, "acc_stderr": 0.01714985851425095, "acc_norm": 0.8, "acc_norm_stderr": 0.01714985851425095 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502326, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502326 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.029771775228145628, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145628 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.028304657943035303, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.028304657943035303 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6053811659192825, "acc_stderr": 0.03280400504755291, "acc_norm": 0.6053811659192825, "acc_norm_stderr": 0.03280400504755291 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.042365112580946336, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.03462419931615624, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.03462419931615624 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7637292464878672, "acc_stderr": 0.015190473717037495, "acc_norm": 0.7637292464878672, "acc_norm_stderr": 0.015190473717037495 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6734104046242775, "acc_stderr": 0.02524826477424284, "acc_norm": 0.6734104046242775, "acc_norm_stderr": 0.02524826477424284 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3474860335195531, "acc_stderr": 0.01592556406020815, "acc_norm": 0.3474860335195531, "acc_norm_stderr": 0.01592556406020815 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.026643278474508755, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.026643278474508755 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6752411575562701, "acc_stderr": 0.026596782287697043, "acc_norm": 0.6752411575562701, "acc_norm_stderr": 0.026596782287697043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.025630824975621344, "acc_norm": 
0.6944444444444444, "acc_norm_stderr": 0.025630824975621344 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4485006518904824, "acc_stderr": 0.012702317490559802, "acc_norm": 0.4485006518904824, "acc_norm_stderr": 0.012702317490559802 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5955882352941176, "acc_stderr": 0.02981263070156974, "acc_norm": 0.5955882352941176, "acc_norm_stderr": 0.02981263070156974 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6176470588235294, "acc_stderr": 0.01965992249362335, "acc_norm": 0.6176470588235294, "acc_norm_stderr": 0.01965992249362335 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.4847001223990208, "mc1_stderr": 0.017495304473187902, "mc2": 0.6471570787741905, "mc2_stderr": 0.015262709036509099 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.011878201073856544 }, "harness|gsm8k|5": { "acc": 0.39196360879454134, "acc_stderr": 0.013447140886023825 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e1
[ "region:us" ]
2024-01-21T09:06:40+00:00
{"pretty_name": "Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1", "dataset_summary": "Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T09:04:23.323517](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e1/blob/main/results_2024-01-21T09-04-23.323517.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6066822120870259,\n \"acc_stderr\": 0.03318943094704429,\n \"acc_norm\": 0.6116640035982913,\n \"acc_norm_stderr\": 0.03386329646645478,\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6471570787741905,\n \"mc2_stderr\": 0.015262709036509099\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.01449757388110828,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.637024497112129,\n \"acc_stderr\": 0.004798751281560844,\n \"acc_norm\": 0.8332005576578371,\n \"acc_norm_stderr\": 0.0037203482062126898\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 
0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.026377567028645858,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.026377567028645858\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n 
\"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n \"acc_stderr\": 0.015190473717037495,\n \"acc_norm\": 0.7637292464878672,\n \"acc_norm_stderr\": 0.015190473717037495\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n \"acc_stderr\": 0.012702317490559802,\n \"acc_norm\": 0.4485006518904824,\n \"acc_norm_stderr\": 0.012702317490559802\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6471570787741905,\n \"mc2_stderr\": 0.015262709036509099\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.39196360879454134,\n \"acc_stderr\": 0.013447140886023825\n }\n}\n```", "repo_url": "https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|arc:challenge|25_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|gsm8k|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hellaswag|10_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T09-04-23.323517.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T09-04-23.323517.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T09-04-23.323517.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T09-04-23.323517.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T09-04-23.323517.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["**/details_harness|winogrande|5_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T09-04-23.323517.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T09_04_23.323517", "path": ["results_2024-01-21T09-04-23.323517.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T09-04-23.323517.parquet"]}]}]}
2024-01-21T09:07:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1 Dataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T09:04:23.323517 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T09:04:23.323517(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T09:04:23.323517(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
28450fa5af5a3391b1bf9dafba04c6a73213dafe
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information: https://huggingface.co/datasets/biglab/webui-350k-elements

```python
from datasets import load_dataset

dataset = load_dataset("biglab/webui-350k-elements")
```
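To make the per-element annotations concrete, here is a minimal sketch of iterating over the element labels and boxes; it assumes the feature names listed in this repack's metadata (`image`, `labels`, `contentBoxes`, `paddingBoxes`, `borderBoxes`, `marginBoxes`, `key_name`), that the label and box sequences are parallel per-element lists, and that streaming is acceptable given the roughly 68 GB download size reported for the train split.

```python
from datasets import load_dataset

# Stream the train split so the full parquet download is not pulled up front
dataset = load_dataset("biglab/webui-350k-elements", split="train", streaming=True)

example = next(iter(dataset))
print(example["key_name"])     # per-example key (see the original WebUI dataset for its meaning)
print(example["image"].size)   # decoded PIL image of the rendered page

# labels and the *Boxes columns are nested sequences; assuming they are parallel,
# zip them to look at each element's label alongside its border box values.
for label, border_box in zip(example["labels"], example["borderBoxes"]):
    print(label, border_box)
```

The same pattern applies to `contentBoxes`, `paddingBoxes`, and `marginBoxes`; consult the original WebUI page linked above for the exact coordinate convention.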
biglab/webui-350k-elements
[ "region:us" ]
2024-01-21T09:31:12+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "labels", "sequence": {"sequence": "string"}}, {"name": "contentBoxes", "sequence": {"sequence": "float64"}}, {"name": "paddingBoxes", "sequence": {"sequence": "float64"}}, {"name": "borderBoxes", "sequence": {"sequence": "float64"}}, {"name": "marginBoxes", "sequence": {"sequence": "float64"}}, {"name": "key_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 75048087304.132, "num_examples": 1020062}], "download_size": 68247972580, "dataset_size": 75048087304.132}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-23T02:36:30+00:00
[]
[]
TAGS #region-us
This is a repacked version of a split of the WebUI dataset into the HuggingFace datasets format. This repacked version focuses on the web element locations/labels and does not contain all data in the original dataset (e.g., element styles and full source code). Please see the original page for this data and more information about the dataset, including a related publication and copyright/license information. URL
[]
[ "TAGS\n#region-us \n" ]
78f92e2c0a04a44570a05c071f16027a5b08dca8
This is a temporary dataset for testing purposes.
Sadik-Sikder/heritage_of_comilla
[ "region:us" ]
2024-01-21T09:37:47+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6837346.0, "num_examples": 49}], "download_size": 6676360, "dataset_size": 6837346.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-21T09:43:14+00:00
[]
[]
TAGS #region-us
This is a temporary dataset for testing purposes.
[]
[ "TAGS\n#region-us \n" ]
9d563bec380c5b8ccace81bfd94fe29fc7c719a6
# Dataset Card for Evaluation run of FelixChao/Severus-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [FelixChao/Severus-7B](https://huggingface.co/FelixChao/Severus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_FelixChao__Severus-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T10:08:30.529941](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Severus-7B/blob/main/results_2024-01-21T10-08-30.529941.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6564105927142331, "acc_stderr": 0.032011062761424514, "acc_norm": 0.6561944704621784, "acc_norm_stderr": 0.03267453714628516, "mc1": 0.45532435740514077, "mc1_stderr": 0.017433490102538765, "mc2": 0.6136319274258737, "mc2_stderr": 0.015253515428580656 }, "harness|arc:challenge|25": { "acc": 0.6561433447098977, "acc_stderr": 0.013880644570156217, "acc_norm": 0.6843003412969283, "acc_norm_stderr": 0.013582571095815291 }, "harness|hellaswag|10": { "acc": 0.686516630153356, "acc_stderr": 0.004629608863272306, "acc_norm": 0.8688508265285799, "acc_norm_stderr": 0.00336873543416138 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7283018867924528, "acc_stderr": 0.027377706624670713, "acc_norm": 0.7283018867924528, "acc_norm_stderr": 0.027377706624670713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, 
"acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.02550648169813821, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.02550648169813821 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.02247325333276877, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.02247325333276877 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083008, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083008 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977927, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977927 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 
0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8550458715596331, "acc_stderr": 0.015094215699700472, "acc_norm": 0.8550458715596331, "acc_norm_stderr": 0.015094215699700472 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.025524722324553353, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.025524722324553353 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545546, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545546 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.423463687150838, "acc_stderr": 0.016525425898773496, "acc_norm": 0.423463687150838, "acc_norm_stderr": 0.016525425898773496 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 
0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214961, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214961 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396556, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396556 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.696078431372549, "acc_stderr": 0.01860755213127983, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.01860755213127983 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.45532435740514077, "mc1_stderr": 0.017433490102538765, "mc2": 0.6136319274258737, "mc2_stderr": 0.015253515428580656 }, "harness|winogrande|5": { "acc": 0.8089976322020521, "acc_stderr": 0.011047808761510423 }, "harness|gsm8k|5": { "acc": 0.7270659590598939, "acc_stderr": 0.012270381151108754 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
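Beyond the per-task split shown in the loading example near the top of this card, a minimal sketch of pulling the aggregated metrics is given below; it assumes only the `results` configuration and `latest` split described in this card and the standard `datasets` helpers.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_FelixChao__Severus-7B"

# List every available configuration (the per-task details plus the aggregated "results")
print(get_dataset_config_names(repo))

# "results" stores the aggregated metrics; the "latest" split always points
# at the most recent evaluation run for this model.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```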
open-llm-leaderboard/details_FelixChao__Severus-7B
[ "region:us" ]
2024-01-21T10:10:48+00:00
{"pretty_name": "Evaluation run of FelixChao/Severus-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Severus-7B](https://huggingface.co/FelixChao/Severus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Severus-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T10:08:30.529941](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Severus-7B/blob/main/results_2024-01-21T10-08-30.529941.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564105927142331,\n \"acc_stderr\": 0.032011062761424514,\n \"acc_norm\": 0.6561944704621784,\n \"acc_norm_stderr\": 0.03267453714628516,\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6136319274258737,\n \"mc2_stderr\": 0.015253515428580656\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156217,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.686516630153356,\n \"acc_stderr\": 0.004629608863272306,\n \"acc_norm\": 0.8688508265285799,\n \"acc_norm_stderr\": 0.00336873543416138\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 
0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 
0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700472,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700472\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.016525425898773496,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.016525425898773496\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.017433490102538765,\n \"mc2\": 0.6136319274258737,\n \"mc2_stderr\": 0.015253515428580656\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510423\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7270659590598939,\n \"acc_stderr\": 0.012270381151108754\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Severus-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-08-30.529941.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["**/details_harness|winogrande|5_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T10-08-30.529941.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T10_08_30.529941", "path": ["results_2024-01-21T10-08-30.529941.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T10-08-30.529941.parquet"]}]}]}
2024-01-21T10:11:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of FelixChao/Severus-7B Dataset automatically created during the evaluation run of model FelixChao/Severus-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T10:08:30.529941 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of FelixChao/Severus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Severus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:08:30.529941(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of FelixChao/Severus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Severus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:08:30.529941(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
60bc52b6e60d7e07a269a33f9b425756ec7dba3f
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T10:09:26.244538](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2/blob/main/results_2024-01-21T10-09-26.244538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.607079139132246, "acc_stderr": 0.03319995227313054, "acc_norm": 0.6118874575204905, "acc_norm_stderr": 0.03387630522975043, "mc1": 0.48592411260709917, "mc1_stderr": 0.01749656371704279, "mc2": 0.6510185973494141, "mc2_stderr": 0.015253991104225291 }, "harness|arc:challenge|25": { "acc": 0.5597269624573379, "acc_stderr": 0.014506769524804232, "acc_norm": 0.606655290102389, "acc_norm_stderr": 0.014275101465693028 }, "harness|hellaswag|10": { "acc": 0.6428002389962159, "acc_stderr": 0.004781950883460502, "acc_norm": 0.8354909380601474, "acc_norm_stderr": 0.0036997919347543633 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.660377358490566, "acc_stderr": 0.02914690474779833, "acc_norm": 0.660377358490566, "acc_norm_stderr": 0.02914690474779833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5722543352601156, "acc_stderr": 0.03772446857518026, "acc_norm": 0.5722543352601156, "acc_norm_stderr": 0.03772446857518026 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5446808510638298, "acc_stderr": 0.03255525359340355, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.03255525359340355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4298245614035088, "acc_stderr": 0.04657047260594964, "acc_norm": 0.4298245614035088, "acc_norm_stderr": 0.04657047260594964 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.025167982333894143, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.025167982333894143 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7129032258064516, "acc_stderr": 0.02573654274559453, "acc_norm": 0.7129032258064516, "acc_norm_stderr": 0.02573654274559453 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072387, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072387 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.025069094387296532, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.025069094387296532 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.030684737115135356, "acc_norm": 
0.6638655462184874, "acc_norm_stderr": 0.030684737115135356 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.01738141556360868, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.01738141556360868 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4537037037037037, "acc_stderr": 0.03395322726375797, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.03395322726375797 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251742, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251742 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.02830465794303531, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.02830465794303531 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330313, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330313 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.038808483010823944, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507437, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507437 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.03487825168497892, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.768837803320562, "acc_stderr": 0.015075523238101083, "acc_norm": 0.768837803320562, "acc_norm_stderr": 0.015075523238101083 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647897, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.015801003729145894, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.015801003729145894 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6862745098039216, "acc_stderr": 0.026568921015457138, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.026568921015457138 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464482, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464482 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7037037037037037, "acc_stderr": 0.02540719779889017, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.02540719779889017 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.02975238965742705, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.02975238965742705 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.439374185136897, "acc_stderr": 0.012676014778580217, "acc_norm": 0.439374185136897, "acc_norm_stderr": 0.012676014778580217 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5882352941176471, "acc_stderr": 0.02989616303312547, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.02989616303312547 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.019722058939618068, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.019722058939618068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653693, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653693 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.48592411260709917, "mc1_stderr": 0.01749656371704279, "mc2": 0.6510185973494141, "mc2_stderr": 0.015253991104225291 }, "harness|winogrande|5": { "acc": 0.7758484609313339, "acc_stderr": 0.011720400740774087 }, "harness|gsm8k|5": { "acc": 0.39423805913570886, "acc_stderr": 0.013460852357095652 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
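As a small usage sketch building on the loading snippet in the card above: the repository name, the per-task config names such as `harness_gsm8k_5`, and the `latest` split are all taken from the config listing in this card's metadata, so this is only an illustration of that documented pattern, not an official recipe.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2"

# Per-task details, e.g. the 5-shot GSM8K predictions for this run.
# Config names follow the "harness_<task>_<n_shot>" pattern listed in the metadata.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

# Aggregated metrics live in the "results" config; the "latest" split
# always points at the most recent evaluation of this model.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```

The timestamped split name listed in the metadata (e.g. `2024_01_21T10_09_26.244538`) can be used in place of `latest` to pin a specific run.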
open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2
[ "region:us" ]
2024-01-21T10:11:47+00:00
{"pretty_name": "Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2", "dataset_summary": "Dataset automatically created during the evaluation run of model [genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2](https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T10:09:26.244538](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__Mistral-7b-instruct-v0.2-summ-sft-bf16-e2/blob/main/results_2024-01-21T10-09-26.244538.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607079139132246,\n \"acc_stderr\": 0.03319995227313054,\n \"acc_norm\": 0.6118874575204905,\n \"acc_norm_stderr\": 0.03387630522975043,\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.01749656371704279,\n \"mc2\": 0.6510185973494141,\n \"mc2_stderr\": 0.015253991104225291\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804232,\n \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693028\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6428002389962159,\n \"acc_stderr\": 0.004781950883460502,\n \"acc_norm\": 0.8354909380601474,\n \"acc_norm_stderr\": 0.0036997919347543633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n 
\"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n \"acc_stderr\": 0.02573654274559453,\n \"acc_norm\": 0.7129032258064516,\n \"acc_norm_stderr\": 0.02573654274559453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n 
\"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251742,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251742\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330313,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330313\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 
0.04725815626252607\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n \"acc_stderr\": 0.015075523238101083,\n \"acc_norm\": 0.768837803320562,\n \"acc_norm_stderr\": 0.015075523238101083\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.015801003729145894,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.015801003729145894\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457138,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889017,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889017\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580217,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580217\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48592411260709917,\n \"mc1_stderr\": 0.01749656371704279,\n \"mc2\": 0.6510185973494141,\n \"mc2_stderr\": 0.015253991104225291\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774087\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.39423805913570886,\n \"acc_stderr\": 0.013460852357095652\n }\n}\n```", "repo_url": "https://huggingface.co/genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-09-26.244538.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-09-26.244538.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-09-26.244538.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-09-26.244538.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-09-26.244538.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["**/details_harness|winogrande|5_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T10-09-26.244538.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T10_09_26.244538", "path": ["results_2024-01-21T10-09-26.244538.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T10-09-26.244538.parquet"]}]}]}
2024-01-21T10:12:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2 Dataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T10:09:26.244538 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:09:26.244538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2\n\n\n\nDataset automatically created during the evaluation run of model genaicore3434/Mistral-7b-instruct-v0.2-summ-sft-bf16-e2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:09:26.244538(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0d92eba6bb0a7359a6816ad20c9ccd02847786e5
# Dataset Card for Evaluation run of qblocks/mistral_7b_HalfEpoch_DolphinCoder <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [qblocks/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_HalfEpoch_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T10:15:02.334082](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-21T10-15-02.334082.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6142255167007181, "acc_stderr": 0.03288830350165743, "acc_norm": 0.6209744380373297, "acc_norm_stderr": 0.0335648510230416, "mc1": 0.2998776009791922, "mc1_stderr": 0.016040352966713627, "mc2": 0.4546153705308199, "mc2_stderr": 0.01483288188604126 }, "harness|arc:challenge|25": { "acc": 0.5827645051194539, "acc_stderr": 0.014409825518403077, "acc_norm": 0.6177474402730375, "acc_norm_stderr": 0.014200454049979274 }, "harness|hellaswag|10": { "acc": 0.6361282613025294, "acc_stderr": 0.004801290954387088, "acc_norm": 0.8226448914558853, "acc_norm_stderr": 0.003811883070911275 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.039531733777491945, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.039531733777491945 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 
0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.037143259063020656, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.037143259063020656 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02487081525105709, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02487081525105709 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7258064516129032, "acc_stderr": 0.025378139970885203, "acc_norm": 0.7258064516129032, "acc_norm_stderr": 0.025378139970885203 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218964, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218964 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6051282051282051, "acc_stderr": 0.0247843169421564, "acc_norm": 0.6051282051282051, "acc_norm_stderr": 0.0247843169421564 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.029185714949857403, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.029185714949857403 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6092436974789915, "acc_stderr": 0.031693802357129965, "acc_norm": 0.6092436974789915, "acc_norm_stderr": 0.031693802357129965 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8073394495412844, "acc_stderr": 0.016909276884936073, "acc_norm": 0.8073394495412844, "acc_norm_stderr": 0.016909276884936073 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7401960784313726, "acc_stderr": 0.03077855467869326, "acc_norm": 0.7401960784313726, "acc_norm_stderr": 0.03077855467869326 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.037683359597287434, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.037683359597287434 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163025, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.041858325989283136, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.041858325989283136 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7982120051085568, "acc_stderr": 0.014351702181636864, "acc_norm": 0.7982120051085568, "acc_norm_stderr": 0.014351702181636864 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7109826589595376, "acc_stderr": 0.02440517393578323, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.02440517393578323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.015949308790233645, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.015949308790233645 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399665, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399665 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7067901234567902, "acc_stderr": 0.025329888171900926, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 
0.025329888171900926 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.029680105565029036, "acc_norm": 0.450354609929078, "acc_norm_stderr": 0.029680105565029036 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6213235294117647, "acc_stderr": 0.02946513363977613, "acc_norm": 0.6213235294117647, "acc_norm_stderr": 0.02946513363977613 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6274509803921569, "acc_stderr": 0.019559646809215927, "acc_norm": 0.6274509803921569, "acc_norm_stderr": 0.019559646809215927 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.04494290866252091, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.04494290866252091 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6571428571428571, "acc_stderr": 0.03038726291954773, "acc_norm": 0.6571428571428571, "acc_norm_stderr": 0.03038726291954773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8109452736318408, "acc_stderr": 0.02768691358801302, "acc_norm": 0.8109452736318408, "acc_norm_stderr": 0.02768691358801302 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.034873508801977725, "acc_norm": 0.86, "acc_norm_stderr": 0.034873508801977725 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.2998776009791922, "mc1_stderr": 0.016040352966713627, "mc2": 0.4546153705308199, "mc2_stderr": 0.01483288188604126 }, "harness|winogrande|5": { "acc": 0.755327545382794, "acc_stderr": 0.012082125654159738 }, "harness|gsm8k|5": { "acc": 0.2964366944655042, "acc_stderr": 0.012579398235589534 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
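The qblocks card above already embeds the canonical `load_dataset` snippet for the winogrande config. As a complementary usage sketch, the same pattern works for any of the other configurations declared further down in this record's metadata; the config and split names below are copied from that data_files listing, nothing else is assumed.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder"

# Per-sample GSM8K details, pinned to the exact run timestamp declared in the
# data_files metadata rather than the moving "latest" split.
gsm8k_run = load_dataset(REPO, "harness_gsm8k_5", split="2024_01_21T10_15_02.334082")

# The "latest" split of the same config always tracks the most recent run.
gsm8k_latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(len(gsm8k_run), len(gsm8k_latest))
```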
open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder
[ "region:us" ]
2024-01-21T10:17:24+00:00
{"pretty_name": "Evaluation run of qblocks/mistral_7b_HalfEpoch_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [qblocks/mistral_7b_HalfEpoch_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_HalfEpoch_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T10:15:02.334082](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder/blob/main/results_2024-01-21T10-15-02.334082.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6142255167007181,\n \"acc_stderr\": 0.03288830350165743,\n \"acc_norm\": 0.6209744380373297,\n \"acc_norm_stderr\": 0.0335648510230416,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713627,\n \"mc2\": 0.4546153705308199,\n \"mc2_stderr\": 0.01483288188604126\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403077,\n \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6361282613025294,\n \"acc_stderr\": 0.004801290954387088,\n \"acc_norm\": 0.8226448914558853,\n \"acc_norm_stderr\": 0.003811883070911275\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.7258064516129032,\n \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218964,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218964\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636864,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 0.014351702181636864\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954773,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713627,\n \"mc2\": 0.4546153705308199,\n \"mc2_stderr\": 0.01483288188604126\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2964366944655042,\n \"acc_stderr\": 
0.012579398235589534\n }\n}\n```", "repo_url": "https://huggingface.co/qblocks/mistral_7b_HalfEpoch_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-15-02.334082.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-15-02.334082.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-15-02.334082.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T10-15-02.334082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-15-02.334082.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T10_15_02.334082", "path": ["**/details_harness|winogrande|5_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T10-15-02.334082.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T10_15_02.334082", "path": ["results_2024-01-21T10-15-02.334082.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T10-15-02.334082.parquet"]}]}]}
2024-01-21T10:17:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of qblocks/mistral_7b_HalfEpoch_DolphinCoder Dataset automatically created during the evaluation run of model qblocks/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T10:15:02.334082 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
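The text above says "do the following" without showing the snippet; a minimal sketch of what loading these details typically looks like is given below. The repository id is an assumption inferred from the naming pattern used by similar evaluation datasets, not taken from this card; the configuration name comes from the metadata listed above.

```python
from datasets import load_dataset

# NOTE: the repository id below is an assumption, inferred from the
# "open-llm-leaderboard/details_<org>__<model>" naming pattern used by similar cards.
data = load_dataset(
    "open-llm-leaderboard/details_qblocks__mistral_7b_HalfEpoch_DolphinCoder",
    "harness_winogrande_5",  # any configuration name from the metadata above works
    split="train",
)
```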
[ "# Dataset Card for Evaluation run of qblocks/mistral_7b_HalfEpoch_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:15:02.334082(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of qblocks/mistral_7b_HalfEpoch_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/mistral_7b_HalfEpoch_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T10:15:02.334082(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7e8164d54170fa78dc36a386aa8c5f7931d5a139
# Hate and Offensive Speech Dataset <!-- Provide a quick summary of the dataset. --> This dataset was created from several datasets that can be found on Hugging Face: - **SetFit/hate_speech_offensive:** https://huggingface.co/datasets/SetFit/hate_speech_offensive - **tweets_hate_speech_detection:** https://huggingface.co/datasets/tweets_hate_speech_detection - **thefrankhsu/hate_speech_twitter:** https://huggingface.co/datasets/thefrankhsu/hate_speech_twitter - **parnoux/hate_speech_open_data_original_class_test_set:** https://huggingface.co/datasets/parnoux/hate_speech_open_data_original_class_test_set ## Dataset Details The dataset contains the following columns: - **tweets_cleaned** - the tweets, cleaned to some extent; they may need additional cleaning for some tasks - **label** - the label, where 0 means hate, 1 means offensive speech, and 2 means neither
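These columns can be read directly with the `datasets` library. A minimal sketch, assuming the default configuration exposes a `train` split (adjust the split name if the dataset is organized differently):

```python
from datasets import load_dataset

# Load the dataset from the Hugging Face Hub
dataset = load_dataset("MartynaKopyta/hate_offensive_tweets")

# Map the integer labels to readable names (0 = hate, 1 = offensive, 2 = neither)
label_names = {0: "hate", 1: "offensive speech", 2: "neither"}

example = dataset["train"][0]  # assumes a "train" split
print(example["tweets_cleaned"], "->", label_names[example["label"]])
```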
MartynaKopyta/hate_offensive_tweets
[ "task_categories:text-classification", "task_categories:token-classification", "language:en", "license:mit", "region:us" ]
2024-01-21T10:30:25+00:00
{"language": ["en"], "license": "mit", "task_categories": ["text-classification", "token-classification"]}
2024-01-21T13:02:59+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #task_categories-token-classification #language-English #license-mit #region-us
# Hate and Offensive Speech Dataset This dataset was created using several datasets that can be found on Hugging Face: -SetFit/hate_speech_offensive:URL -tweets_hate_speech_detection:URL -thefrankhsu/hate_speech_twitter:URL -parnoux/hate_speech_open_data_original_class_test_set:URL ## Dataset Details The dataset contains columns: -tweets_cleaned - that stores tweets, cleaned to some extent, may need additional work for some tasks -label - data labels, where 0 means hate, 1 means offensive speech, 2 means neither
[ "# Hate and Offensive Speech Dataset\n\n\n\nThis dataset was created using several datasets that can be found on Hugging Face: \n\n-SetFit/hate_speech_offensive:URL\n-tweets_hate_speech_detection:URL\n-thefrankhsu/hate_speech_twitter:URL\n-parnoux/hate_speech_open_data_original_class_test_set:URL", "## Dataset Details\n\nThe dataset contains columns:\n\n-tweets_cleaned - that stores tweets, cleaned to some extent, may need additional work for some tasks\n\n-label - data labels, where 0 means hate, 1 means offensive speech, 2 means neither" ]
[ "TAGS\n#task_categories-text-classification #task_categories-token-classification #language-English #license-mit #region-us \n", "# Hate and Offensive Speech Dataset\n\n\n\nThis dataset was created using several datasets that can be found on Hugging Face: \n\n-SetFit/hate_speech_offensive:URL\n-tweets_hate_speech_detection:URL\n-thefrankhsu/hate_speech_twitter:URL\n-parnoux/hate_speech_open_data_original_class_test_set:URL", "## Dataset Details\n\nThe dataset contains columns:\n\n-tweets_cleaned - that stores tweets, cleaned to some extent, may need additional work for some tasks\n\n-label - data labels, where 0 means hate, 1 means offensive speech, 2 means neither" ]
d189f17bcf7c7f554f48eac0c161ec3218a7a2b8
# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T11:15:19.870178](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B/blob/main/results_2024-01-21T11-15-19.870178.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6537295532650617, "acc_stderr": 0.0320842334613103, "acc_norm": 0.6525780615840782, "acc_norm_stderr": 0.03277229116935712, "mc1": 0.627906976744186, "mc1_stderr": 0.01692109011881403, "mc2": 0.780160272588061, "mc2_stderr": 0.013871089730066658 }, "harness|arc:challenge|25": { "acc": 0.7209897610921502, "acc_stderr": 0.013106784883601334, "acc_norm": 0.7491467576791809, "acc_norm_stderr": 0.012668198621315425 }, "harness|hellaswag|10": { "acc": 0.7189802828121888, "acc_stderr": 0.004485784468576664, "acc_norm": 0.8930491933877713, "acc_norm_stderr": 0.0030841908180933076 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04444444444444449, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7967741935483871, "acc_stderr": 0.022891687984554963, "acc_norm": 0.7967741935483871, "acc_norm_stderr": 0.022891687984554963 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7727272727272727, "acc_stderr": 0.02985751567338641, "acc_norm": 0.7727272727272727, "acc_norm_stderr": 0.02985751567338641 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131147, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131147 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 
0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243839, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243839 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4491620111731844, "acc_stderr": 0.01663583834163192, "acc_norm": 0.4491620111731844, "acc_norm_stderr": 0.01663583834163192 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667874, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083136, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083136 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396553, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396553 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.0189754279205072, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.0189754279205072 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.627906976744186, "mc1_stderr": 0.01692109011881403, "mc2": 0.780160272588061, "mc2_stderr": 0.013871089730066658 }, "harness|winogrande|5": { "acc": 0.8823993685872139, "acc_stderr": 0.009053584685573185 }, "harness|gsm8k|5": { "acc": 0.6952236542835482, "acc_stderr": 0.012679297549515437 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
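For the aggregated metrics rather than the per-task details, the "results" configuration described above can be loaded the same way. A minimal sketch, assuming this configuration follows the same split layout (timestamped splits plus "latest") as the configuration metadata shown for the other evaluation runs in this collection:

```python
from datasets import load_dataset

# "latest" points to the most recent evaluation run for this model
results = load_dataset(
    "open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B",
    "results",
    split="latest",
)
print(results[0])
```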
open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
[ "region:us" ]
2024-01-21T11:17:35+00:00
{"pretty_name": "Evaluation run of yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T11:15:19.870178](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B/blob/main/results_2024-01-21T11-15-19.870178.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537295532650617,\n \"acc_stderr\": 0.0320842334613103,\n \"acc_norm\": 0.6525780615840782,\n \"acc_norm_stderr\": 0.03277229116935712,\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.780160272588061,\n \"mc2_stderr\": 0.013871089730066658\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7209897610921502,\n \"acc_stderr\": 0.013106784883601334,\n \"acc_norm\": 0.7491467576791809,\n \"acc_norm_stderr\": 0.012668198621315425\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7189802828121888,\n \"acc_stderr\": 0.004485784468576664,\n \"acc_norm\": 0.8930491933877713,\n \"acc_norm_stderr\": 0.0030841908180933076\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 
0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 
0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n 
\"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.780160272588061,\n \"mc2_stderr\": 0.013871089730066658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8823993685872139,\n \"acc_stderr\": 0.009053584685573185\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.012679297549515437\n }\n}\n```", "repo_url": "https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-15-19.870178.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-15-19.870178.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-15-19.870178.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-15-19.870178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-15-19.870178.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["**/details_harness|winogrande|5_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-21T11-15-19.870178.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T11_15_19.870178", "path": ["results_2024-01-21T11-15-19.870178.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T11-15-19.870178.parquet"]}]}]}
2024-01-21T11:17:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B Dataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T11:15:19.870178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:15:19.870178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:15:19.870178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
468f924299da32703a44afcbd764b475e3b264b5
# Dataset Card for Evaluation run of senseable/Westlake-7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [senseable/Westlake-7B](https://huggingface.co/senseable/Westlake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_senseable__Westlake-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-21T11:21:00.496519](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Westlake-7B/blob/main/results_2024-01-21T11-21-00.496519.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6525383312722257, "acc_stderr": 0.03220255499978126, "acc_norm": 0.6519411952580282, "acc_norm_stderr": 0.03288129965668576, "mc1": 0.5336597307221542, "mc1_stderr": 0.0174637938671681, "mc2": 0.673627520150055, "mc2_stderr": 0.015362156789879754 }, "harness|arc:challenge|25": { "acc": 0.7047781569965871, "acc_stderr": 0.01332975029338232, "acc_norm": 0.7320819112627986, "acc_norm_stderr": 0.012942030195136445 }, "harness|hellaswag|10": { "acc": 0.7204740091615216, "acc_stderr": 0.004478491697891231, "acc_norm": 0.884883489344752, "acc_norm_stderr": 0.003185102191687907 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700914, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700914 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.025542846817400492, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.025542846817400492 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7580645161290323, "acc_stderr": 0.024362599693031093, "acc_norm": 0.7580645161290323, "acc_norm_stderr": 0.024362599693031093 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919436, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.024121125416941197, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.024121125416941197 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886786, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976037, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976037 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.025524722324553346, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.025524722324553346 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572213, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572213 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128137, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8352490421455939, "acc_stderr": 0.0132653462613238, "acc_norm": 0.8352490421455939, "acc_norm_stderr": 0.0132653462613238 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4301675977653631, "acc_stderr": 0.01655860163604103, "acc_norm": 0.4301675977653631, "acc_norm_stderr": 0.01655860163604103 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.02609016250427905, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998482, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998482 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.02858270975389845, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.02858270975389845 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.019117213911495144, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.019117213911495144 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5336597307221542, "mc1_stderr": 0.0174637938671681, "mc2": 0.673627520150055, "mc2_stderr": 0.015362156789879754 }, "harness|winogrande|5": { "acc": 0.8602999210734017, "acc_stderr": 0.009743307618298174 }, "harness|gsm8k|5": { "acc": 0.6717210007581501, "acc_stderr": 0.012934758019449613 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_senseable__Westlake-7B
[ "region:us" ]
2024-01-21T11:23:19+00:00
{"pretty_name": "Evaluation run of senseable/Westlake-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [senseable/Westlake-7B](https://huggingface.co/senseable/Westlake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_senseable__Westlake-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T11:21:00.496519](https://huggingface.co/datasets/open-llm-leaderboard/details_senseable__Westlake-7B/blob/main/results_2024-01-21T11-21-00.496519.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6525383312722257,\n \"acc_stderr\": 0.03220255499978126,\n \"acc_norm\": 0.6519411952580282,\n \"acc_norm_stderr\": 0.03288129965668576,\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.0174637938671681,\n \"mc2\": 0.673627520150055,\n \"mc2_stderr\": 0.015362156789879754\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.01332975029338232,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136445\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7204740091615216,\n \"acc_stderr\": 0.004478491697891231,\n \"acc_norm\": 0.884883489344752,\n \"acc_norm_stderr\": 0.003185102191687907\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 
0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400492,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400492\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.0132653462613238,\n 
\"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.0132653462613238\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604103,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.0174637938671681,\n \"mc2\": 0.673627520150055,\n \"mc2_stderr\": 0.015362156789879754\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8602999210734017,\n \"acc_stderr\": 0.009743307618298174\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6717210007581501,\n \"acc_stderr\": 0.012934758019449613\n }\n}\n```", "repo_url": "https://huggingface.co/senseable/Westlake-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-21-00.496519.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-21-00.496519.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-21-00.496519.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-21-00.496519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-21-00.496519.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-21-00.496519.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["**/details_harness|winogrande|5_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T11-21-00.496519.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T11_21_00.496519", "path": ["results_2024-01-21T11-21-00.496519.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T11-21-00.496519.parquet"]}]}]}
2024-01-21T11:23:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of senseable/Westlake-7B Dataset automatically created during the evaluation run of model senseable/Westlake-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T11:21:00.496519 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
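The flattened card text above says "To load the details from a run, you can for instance do the following:" but the accompanying snippet was stripped when the card was flattened into this field. A minimal sketch of such a call, assuming the Hugging Face `datasets` library and the config names listed in this record's metadata (e.g. `harness_winogrande_5` and `results`), might look like:

```python
from datasets import load_dataset

# Per-task details for one evaluated task; the "latest" split always points
# to the most recent run (here 2024-01-21T11:21:00.496519).
details = load_dataset(
    "open-llm-leaderboard/details_senseable__Westlake-7B",
    "harness_winogrande_5",
    split="latest",
)

# Aggregated metrics for the whole run are stored in the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_senseable__Westlake-7B",
    "results",
    split="latest",
)
```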
[ "# Dataset Card for Evaluation run of senseable/Westlake-7B\n\n\n\nDataset automatically created during the evaluation run of model senseable/Westlake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:21:00.496519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of senseable/Westlake-7B\n\n\n\nDataset automatically created during the evaluation run of model senseable/Westlake-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:21:00.496519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0b09674b2ae9579eba0a28c60d569b6b0bf96c93
Mini-Laion is a subset of the Laion-400M dataset
Prasant/Mini-Laion
[ "license:apache-2.0", "region:us" ]
2024-01-21T11:31:14+00:00
{"license": "apache-2.0"}
2024-01-21T18:20:03+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
Mini-Laion is a subset of the Laion-400M dataset
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
3bfc1af4f07f99b68a92610b0ce41f186c93dcae
# Dataset Card for Evaluation run of freeCS-dot-org/Zero-7B-test-3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [freeCS-dot-org/Zero-7B-test-3](https://huggingface.co/freeCS-dot-org/Zero-7B-test-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T11:45:44.746324](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3/blob/main/results_2024-01-21T11-45-44.746324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5351168403231941, "acc_stderr": 0.03393847969564533, "acc_norm": 0.5411522378835072, "acc_norm_stderr": 0.034676938364257434, "mc1": 0.41615667074663404, "mc1_stderr": 0.017255657502903043, "mc2": 0.5830418943671974, "mc2_stderr": 0.015473737248335182 }, "harness|arc:challenge|25": { "acc": 0.5887372013651877, "acc_stderr": 0.014379441068522084, "acc_norm": 0.6424914675767918, "acc_norm_stderr": 0.014005494275916573 }, "harness|hellaswag|10": { "acc": 0.5992830113523202, "acc_stderr": 0.004890422457747264, "acc_norm": 0.798546106353316, "acc_norm_stderr": 0.004002665957282743 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5259259259259259, "acc_stderr": 0.04313531696750574, "acc_norm": 0.5259259259259259, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.04026097083296564, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.04026097083296564 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6113207547169811, "acc_stderr": 0.030000485448675986, "acc_norm": 0.6113207547169811, "acc_norm_stderr": 0.030000485448675986 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 
0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887248, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.032619369184673806, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.032619369184673806 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.046151869625837026, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.046151869625837026 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.041546596717075474, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.024833839825562417, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.024833839825562417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.040735243221471255, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.040735243221471255 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6645161290322581, "acc_stderr": 0.02686020644472435, "acc_norm": 0.6645161290322581, "acc_norm_stderr": 0.02686020644472435 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3694581280788177, "acc_stderr": 0.03395970381998574, "acc_norm": 0.3694581280788177, "acc_norm_stderr": 0.03395970381998574 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.45454545454545453, "acc_stderr": 0.038881769216741, "acc_norm": 0.45454545454545453, "acc_norm_stderr": 0.038881769216741 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6565656565656566, "acc_stderr": 0.03383201223244441, "acc_norm": 0.6565656565656566, "acc_norm_stderr": 0.03383201223244441 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.02897908979429673, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5615384615384615, "acc_stderr": 0.025158266016868575, "acc_norm": 0.5615384615384615, "acc_norm_stderr": 0.025158266016868575 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5588235294117647, "acc_stderr": 0.0322529423239964, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.0322529423239964 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, 
"acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7504587155963303, "acc_stderr": 0.018553897629501624, "acc_norm": 0.7504587155963303, "acc_norm_stderr": 0.018553897629501624 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3611111111111111, "acc_stderr": 0.032757734861009996, "acc_norm": 0.3611111111111111, "acc_norm_stderr": 0.032757734861009996 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.4852941176470588, "acc_stderr": 0.03507793834791325, "acc_norm": 0.4852941176470588, "acc_norm_stderr": 0.03507793834791325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.030165137867847008, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.030165137867847008 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.032443052830087304, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.032443052830087304 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6183206106870229, "acc_stderr": 0.0426073515764456, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.0426073515764456 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709696, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709696 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6481481481481481, "acc_stderr": 0.04616631111801713, "acc_norm": 0.6481481481481481, "acc_norm_stderr": 0.04616631111801713 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6319018404907976, "acc_stderr": 0.03789213935838396, "acc_norm": 0.6319018404907976, "acc_norm_stderr": 0.03789213935838396 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.02624677294689048, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.02624677294689048 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562429, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562429 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7458492975734355, "acc_stderr": 0.015569254692045755, "acc_norm": 0.7458492975734355, "acc_norm_stderr": 0.015569254692045755 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5982658959537572, "acc_stderr": 0.026394104177643634, "acc_norm": 0.5982658959537572, "acc_norm_stderr": 0.026394104177643634 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.01426555419233115, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.01426555419233115 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5555555555555556, "acc_stderr": 0.02845263998508801, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.02845263998508801 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.594855305466238, "acc_stderr": 0.027882383791325953, "acc_norm": 0.594855305466238, "acc_norm_stderr": 0.027882383791325953 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5987654320987654, "acc_stderr": 0.027272582849839792, "acc_norm": 0.5987654320987654, "acc_norm_stderr": 0.027272582849839792 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.39361702127659576, "acc_stderr": 0.029144544781596154, "acc_norm": 0.39361702127659576, "acc_norm_stderr": 0.029144544781596154 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.37157757496740546, "acc_stderr": 0.01234182851452829, "acc_norm": 0.37157757496740546, "acc_norm_stderr": 0.01234182851452829 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.44485294117647056, "acc_stderr": 0.030187532060329394, "acc_norm": 0.44485294117647056, "acc_norm_stderr": 0.030187532060329394 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5261437908496732, "acc_stderr": 0.020200164564804588, "acc_norm": 0.5261437908496732, "acc_norm_stderr": 0.020200164564804588 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5795918367346938, "acc_stderr": 0.03160106993449601, "acc_norm": 0.5795918367346938, "acc_norm_stderr": 0.03160106993449601 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6965174129353234, "acc_stderr": 0.03251006816458618, "acc_norm": 0.6965174129353234, "acc_norm_stderr": 0.03251006816458618 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699121, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.41615667074663404, "mc1_stderr": 0.017255657502903043, "mc2": 0.5830418943671974, "mc2_stderr": 0.015473737248335182 }, "harness|winogrande|5": { "acc": 0.7632202052091555, "acc_stderr": 0.011947592365207397 }, "harness|gsm8k|5": { "acc": 0.20394238059135708, "acc_stderr": 0.011098602284899175 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
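For the aggregated metrics of this run, a minimal sketch along the same lines, using the "results" configuration and the "latest" split named in this card (the exact columns of the returned record depend on what the run stored):

```python
from datasets import load_dataset

# "results" is the configuration holding the per-task aggregates for this run;
# the "latest" split always points at the newest results.
results = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3",
    "results",
    split="latest",
)
print(results[0])  # inspect the stored aggregate record
```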
open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3
[ "region:us" ]
2024-01-21T11:48:02+00:00
{"pretty_name": "Evaluation run of freeCS-dot-org/Zero-7B-test-3", "dataset_summary": "Dataset automatically created during the evaluation run of model [freeCS-dot-org/Zero-7B-test-3](https://huggingface.co/freeCS-dot-org/Zero-7B-test-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T11:45:44.746324](https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3/blob/main/results_2024-01-21T11-45-44.746324.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5351168403231941,\n \"acc_stderr\": 0.03393847969564533,\n \"acc_norm\": 0.5411522378835072,\n \"acc_norm_stderr\": 0.034676938364257434,\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5830418943671974,\n \"mc2_stderr\": 0.015473737248335182\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5992830113523202,\n \"acc_stderr\": 0.004890422457747264,\n \"acc_norm\": 0.798546106353316,\n \"acc_norm_stderr\": 0.004002665957282743\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.032619369184673806,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.032619369184673806\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.038881769216741,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.038881769216741\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868575,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868575\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501624,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791325,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847008,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847008\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562429,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562429\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7458492975734355,\n \"acc_stderr\": 0.015569254692045755,\n \"acc_norm\": 0.7458492975734355,\n \"acc_norm_stderr\": 0.015569254692045755\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.01426555419233115,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.01426555419233115\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839792,\n \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839792\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37157757496740546,\n \"acc_stderr\": 0.01234182851452829,\n \"acc_norm\": 0.37157757496740546,\n \"acc_norm_stderr\": 0.01234182851452829\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329394,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329394\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.020200164564804588,\n \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.020200164564804588\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5830418943671974,\n \"mc2_stderr\": 0.015473737248335182\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207397\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20394238059135708,\n \"acc_stderr\": 
0.011098602284899175\n }\n}\n```", "repo_url": "https://huggingface.co/freeCS-dot-org/Zero-7B-test-3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-45-44.746324.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-45-44.746324.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-45-44.746324.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T11-45-44.746324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-45-44.746324.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T11_45_44.746324", "path": ["**/details_harness|winogrande|5_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T11-45-44.746324.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T11_45_44.746324", "path": ["results_2024-01-21T11-45-44.746324.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T11-45-44.746324.parquet"]}]}]}
2024-01-21T11:48:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of freeCS-dot-org/Zero-7B-test-3 Dataset automatically created during the evaluation run of model freeCS-dot-org/Zero-7B-test-3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T11:45:44.746324 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
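The card above says "To load the details from a run, you can for instance do the following:" but the accompanying code block was dropped when the text was flattened. A minimal reconstruction, assuming the repository follows the same `open-llm-leaderboard/details_{org}__{model}` naming visible in the other records here:

```python
from datasets import load_dataset

# Assumed repository id; the pattern is taken from neighbouring records, not from this card.
data = load_dataset(
    "open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3",
    "harness_winogrande_5",
    split="train",
)
```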
[ "# Dataset Card for Evaluation run of freeCS-dot-org/Zero-7B-test-3\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/Zero-7B-test-3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:45:44.746324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of freeCS-dot-org/Zero-7B-test-3\n\n\n\nDataset automatically created during the evaluation run of model freeCS-dot-org/Zero-7B-test-3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T11:45:44.746324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5a0169051e3c82d2eee8749db19caa9fc2a809c0
# Dataset Card for Evaluation run of Vasanth/Valor_Macaroni_moe <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Vasanth/Valor_Macaroni_moe](https://huggingface.co/Vasanth/Valor_Macaroni_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T12:08:54.379956](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe/blob/main/results_2024-01-21T12-08-54.379956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6507794825269755, "acc_stderr": 0.03207408539778976, "acc_norm": 0.650451847875648, "acc_norm_stderr": 0.032740448600963236, "mc1": 0.4920440636474908, "mc1_stderr": 0.01750128507455183, "mc2": 0.6464987959765103, "mc2_stderr": 0.015375185619864559 }, "harness|arc:challenge|25": { "acc": 0.6757679180887372, "acc_stderr": 0.013678810399518822, "acc_norm": 0.7030716723549488, "acc_norm_stderr": 0.013352025976725223 }, "harness|hellaswag|10": { "acc": 0.6838279227245568, "acc_stderr": 0.004640306719628063, "acc_norm": 0.8661621190997809, "acc_norm_stderr": 0.003397822089857292 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.02794321998933714, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.02794321998933714 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.035506839891655796, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.035506839891655796 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.02550648169813821, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.02550648169813821 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328972, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328972 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.03068473711513536, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.03068473711513536 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.02655837250266192, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.02655837250266192 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.02344582627654554, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.02344582627654554 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.025403832978179604, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.025403832978179604 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7345679012345679, "acc_stderr": 0.024569223600460845, "acc_norm": 0.7345679012345679, "acc_norm_stderr": 0.024569223600460845 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46479791395045633, "acc_stderr": 0.012738547371303954, "acc_norm": 0.46479791395045633, "acc_norm_stderr": 0.012738547371303954 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898445, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898445 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.019094228167000328, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.019094228167000328 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.02783302387139967, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.02783302387139967 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.4920440636474908, "mc1_stderr": 0.01750128507455183, "mc2": 0.6464987959765103, "mc2_stderr": 0.015375185619864559 }, "harness|winogrande|5": { "acc": 0.8224151539068666, "acc_stderr": 0.010740676861359244 }, "harness|gsm8k|5": { "acc": 0.7081122062168309, "acc_stderr": 0.012522795894420869 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
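The Vasanth/Valor_Macaroni_moe card above gives the per-task metrics as a JSON blob and notes that the extra "results" configuration stores the aggregated numbers, with the "latest" split always pointing at the most recent run (2024-01-21T12:08:54.379956 here). A small sketch of pulling both the aggregates and one task's per-example details; the exact column layout of the parquet files is not described in the card, so nothing beyond loading is assumed.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe"

# Aggregated metrics for the run (the source of the JSON shown in the card).
results = load_dataset(repo, "results", split="latest")

# Per-example details for a single task, e.g. the 5-shot GSM8K config listed in the metadata.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

print(results)
print(gsm8k_details)
```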
open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe
[ "region:us" ]
2024-01-21T12:11:07+00:00
{"pretty_name": "Evaluation run of Vasanth/Valor_Macaroni_moe", "dataset_summary": "Dataset automatically created during the evaluation run of model [Vasanth/Valor_Macaroni_moe](https://huggingface.co/Vasanth/Valor_Macaroni_moe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T12:08:54.379956](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe/blob/main/results_2024-01-21T12-08-54.379956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6507794825269755,\n \"acc_stderr\": 0.03207408539778976,\n \"acc_norm\": 0.650451847875648,\n \"acc_norm_stderr\": 0.032740448600963236,\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6464987959765103,\n \"mc2_stderr\": 0.015375185619864559\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725223\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6838279227245568,\n \"acc_stderr\": 0.004640306719628063,\n \"acc_norm\": 0.8661621190997809,\n \"acc_norm_stderr\": 0.003397822089857292\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303954,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303954\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139967,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139967\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n \"mc1_stderr\": 0.01750128507455183,\n \"mc2\": 0.6464987959765103,\n \"mc2_stderr\": 0.015375185619864559\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359244\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \"acc_stderr\": 0.012522795894420869\n 
}\n}\n```", "repo_url": "https://huggingface.co/Vasanth/Valor_Macaroni_moe", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T12_08_54.379956", "path": ["**/details_harness|winogrande|5_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T12-08-54.379956.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T12_08_54.379956", "path": ["results_2024-01-21T12-08-54.379956.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T12-08-54.379956.parquet"]}]}]}
2024-01-21T12:11:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Vasanth/Valor_Macaroni_moe Dataset automatically created during the evaluation run of model Vasanth/Valor_Macaroni_moe on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T12:08:54.379956 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Vasanth/Valor_Macaroni_moe\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Valor_Macaroni_moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:08:54.379956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Vasanth/Valor_Macaroni_moe\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Valor_Macaroni_moe on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:08:54.379956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7d72604d11b63d90a63f53a3b579d2963f5f8d38
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T12:17:50.291881](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3/blob/main/results_2024-01-21T12-17-50.291881.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6072497690805192, "acc_stderr": 0.03321786740565002, "acc_norm": 0.6120138406900769, "acc_norm_stderr": 0.03389600513911076, "mc1": 0.4908200734394125, "mc1_stderr": 0.01750055072481975, "mc2": 0.6521720076920506, "mc2_stderr": 0.015261907117760843 }, "harness|arc:challenge|25": { "acc": 0.5588737201365188, "acc_stderr": 0.014509747749064663, "acc_norm": 0.6032423208191127, "acc_norm_stderr": 0.014296513020180647 }, "harness|hellaswag|10": { "acc": 0.6443935471021709, "acc_stderr": 0.004777183508949808, "acc_norm": 0.8367855008962358, "acc_norm_stderr": 0.0036880598312390186 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5722543352601156, "acc_stderr": 0.03772446857518026, "acc_norm": 0.5722543352601156, "acc_norm_stderr": 0.03772446857518026 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.032529096196131965, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3862433862433862, "acc_stderr": 0.025075981767601684, "acc_norm": 0.3862433862433862, "acc_norm_stderr": 0.025075981767601684 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7032258064516129, "acc_stderr": 0.025988500792411894, "acc_norm": 0.7032258064516129, "acc_norm_stderr": 0.025988500792411894 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072387, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072387 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5794871794871795, "acc_stderr": 0.025028610276710862, "acc_norm": 0.5794871794871795, "acc_norm_stderr": 0.025028610276710862 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, 
"acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.01726674208763079, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.01726674208763079 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4537037037037037, "acc_stderr": 0.03395322726375797, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.03395322726375797 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251742, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251742 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330313, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330313 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.038808483010823944, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.04373313040914761, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.04373313040914761 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905713, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905713 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6763005780346821, "acc_stderr": 0.025190181327608408, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.025190181327608408 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33631284916201115, "acc_stderr": 0.015801003729145897, "acc_norm": 0.33631284916201115, "acc_norm_stderr": 0.015801003729145897 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.026643278474508755, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.026643278474508755 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464482, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464482 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7006172839506173, "acc_stderr": 0.02548311560119546, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.02548311560119546 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.012685906538206244, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.012685906538206244 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6127450980392157, "acc_stderr": 0.019706875804085644, "acc_norm": 0.6127450980392157, "acc_norm_stderr": 0.019706875804085644 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7142857142857143, "acc_stderr": 0.0289205832206756, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.0289205832206756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7761194029850746, "acc_stderr": 0.029475250236017204, "acc_norm": 0.7761194029850746, "acc_norm_stderr": 0.029475250236017204 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653693, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653693 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.4908200734394125, "mc1_stderr": 0.01750055072481975, "mc2": 0.6521720076920506, "mc2_stderr": 0.015261907117760843 }, "harness|winogrande|5": { "acc": 0.7782162588792423, "acc_stderr": 0.011676109244497813 }, "harness|gsm8k|5": { "acc": 0.39196360879454134, "acc_stderr": 0.013447140886023824 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
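Beyond the example above, the same `load_dataset` call can also target the aggregated "results" configuration or a specific timestamped run. The snippet below is a minimal sketch that only uses configuration and split names listed in this dataset's metadata; the exact column layout of the underlying parquet files is not reproduced here and may vary.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3"

# Aggregated metrics of the most recent run: the "results" configuration,
# whose "latest" split always points to the newest results file.
results = load_dataset(REPO, "results", split="latest")
print(results)

# Per-sample details for a single task, pinned to one evaluation run by
# using its timestamped split instead of "latest".
arc_details = load_dataset(REPO, "harness_arc_challenge_25", split="2024_01_21T12_17_50.291881")
print(arc_details[0])
```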
open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3
[ "region:us" ]
2024-01-21T12:14:05+00:00
{"pretty_name": "Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3", "dataset_summary": "Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T12:17:50.291881](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3/blob/main/results_2024-01-21T12-17-50.291881.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6072497690805192,\n \"acc_stderr\": 0.03321786740565002,\n \"acc_norm\": 0.6120138406900769,\n \"acc_norm_stderr\": 0.03389600513911076,\n \"mc1\": 0.4908200734394125,\n \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6521720076920506,\n \"mc2_stderr\": 0.015261907117760843\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180647\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6443935471021709,\n \"acc_stderr\": 0.004777183508949808,\n \"acc_norm\": 0.8367855008962358,\n \"acc_norm_stderr\": 0.0036880598312390186\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 
0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n 
\"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251742,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251742\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330313,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330313\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n 
\"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n \"acc_stderr\": 0.015104550008905713,\n \"acc_norm\": 0.7675606641123882,\n \"acc_norm_stderr\": 0.015104550008905713\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.015801003729145897,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.015801003729145897\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119546,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119546\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085644,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4908200734394125,\n \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6521720076920506,\n \"mc2_stderr\": 0.015261907117760843\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 
0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39196360879454134,\n \"acc_stderr\": 0.013447140886023824\n }\n}\n```", "repo_url": "https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-11-48.144092.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-11-48.144092.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-50.291881.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-50.291881.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-50.291881.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-50.291881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": 
"2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-50.291881.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["**/details_harness|winogrande|5_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["**/details_harness|winogrande|5_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T12-17-50.291881.parquet"]}]}, 
{"config_name": "results", "data_files": [{"split": "2024_01_21T12_11_48.144092", "path": ["results_2024-01-21T12-11-48.144092.parquet"]}, {"split": "2024_01_21T12_17_50.291881", "path": ["results_2024-01-21T12-17-50.291881.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T12-17-50.291881.parquet"]}]}]}
2024-01-21T12:20:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3 Dataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T12:17:50.291881 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
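In this flattened copy of the card, the loading snippet that normally follows "you can for instance do the following:" was stripped. A minimal sketch of that snippet, assuming the repository id follows the same `details_<org>__<model>` convention as the other cards in this dump, would be:

```python
from datasets import load_dataset

# Assumed repository id for this model's evaluation details.
data = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-sft-bf16-e3",
    "harness_winogrande_5",
    split="train",
)
```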
[ "# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:17:50.291881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3\n\n\n\nDataset automatically created during the evaluation run of model silvercoder45/Mistral-7b-instruct-v0.2-summ-sft-bf16-e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:17:50.291881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
58cfdcdfad57c527e0c2013394fb0edbbecb9274
# Dataset Card for Evaluation run of binbi/SF-72B-V1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1](https://huggingface.co/binbi/SF-72B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_binbi__SF-72B-V1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T12:28:43.484005](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1/blob/main/results_2024-01-21T12-28-43.484005.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2312183583064229, "acc_stderr": 0.029963667974972664, "acc_norm": 0.2311618522242625, "acc_norm_stderr": 0.030751973434955327, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731603, "mc2": 0.4877798130299791, "mc2_stderr": 0.016318959342538 }, "harness|arc:challenge|25": { "acc": 0.2235494880546075, "acc_stderr": 0.012174896631202605, "acc_norm": 0.2627986348122867, "acc_norm_stderr": 0.012862523175351333 }, "harness|hellaswag|10": { "acc": 0.25801633140808605, "acc_stderr": 0.004366488167386393, "acc_norm": 0.24865564628560047, "acc_norm_stderr": 0.004313503876346078 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.23018867924528302, "acc_stderr": 0.02590789712240817, "acc_norm": 0.23018867924528302, "acc_norm_stderr": 0.02590789712240817 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 
0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.03970158273235172, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.03970158273235172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371376, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371376 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 
0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604243, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604243 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.25112107623318386, "acc_stderr": 0.02910522083322462, "acc_norm": 0.25112107623318386, "acc_norm_stderr": 0.02910522083322462 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.03915345408847836, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.22330097087378642, "acc_stderr": 0.04123553189891431, "acc_norm": 0.22330097087378642, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.21711366538952745, "acc_stderr": 0.014743125394823295, "acc_norm": 0.21711366538952745, "acc_norm_stderr": 0.014743125394823295 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25163398692810457, "acc_stderr": 0.01755581809132226, "acc_norm": 0.25163398692810457, "acc_norm_stderr": 0.01755581809132226 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.23391812865497075, "acc_stderr": 0.032467217651178264, "acc_norm": 0.23391812865497075, "acc_norm_stderr": 0.032467217651178264 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731603, "mc2": 0.4877798130299791, "mc2_stderr": 0.016318959342538 }, "harness|winogrande|5": { "acc": 0.4956590370955012, "acc_stderr": 0.014051956064076906 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_binbi__SF-72B-V1
[ "region:us" ]
2024-01-21T12:30:55+00:00
{"pretty_name": "Evaluation run of binbi/SF-72B-V1", "dataset_summary": "Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1](https://huggingface.co/binbi/SF-72B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_binbi__SF-72B-V1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T12:28:43.484005](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1/blob/main/results_2024-01-21T12-28-43.484005.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2312183583064229,\n \"acc_stderr\": 0.029963667974972664,\n \"acc_norm\": 0.2311618522242625,\n \"acc_norm_stderr\": 0.030751973434955327,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n \"mc2_stderr\": 0.016318959342538\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202605,\n \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n \"acc_stderr\": 0.004366488167386393,\n \"acc_norm\": 0.24865564628560047,\n \"acc_norm_stderr\": 0.004313503876346078\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.20256410256410257,\n \"acc_stderr\": 0.020377660970371376,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371376\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.25112107623318386,\n \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21711366538952745,\n \"acc_stderr\": 0.014743125394823295,\n \"acc_norm\": 
0.21711366538952745,\n \"acc_norm_stderr\": 0.014743125394823295\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n \"mc2_stderr\": 0.016318959342538\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/binbi/SF-72B-V1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["**/details_harness|winogrande|5_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T12-28-43.484005.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_21T12_28_43.484005", "path": ["results_2024-01-21T12-28-43.484005.parquet"]}, {"split": "latest", "path": 
["results_2024-01-21T12-28-43.484005.parquet"]}]}]}
2024-01-21T12:31:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of binbi/SF-72B-V1 Dataset automatically created during the evaluation run of model binbi/SF-72B-V1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card text): ## Latest results These are the latest results from run 2024-01-21T12:28:43.484005 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
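The load call referred to above ("To load the details from a run, you can for instance do the following") can be reconstructed from the `load_dataset` invocation quoted in this record's metadata; a minimal sketch is below. Any other `harness_*` config listed in the metadata can stand in for `harness_winogrande_5`, and the aggregated `results` config loads the same way.

```python
from datasets import load_dataset

# Load the detail split of one evaluated task from this record's repo.
# Repo id and config name mirror the call quoted in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_binbi__SF-72B-V1",
    "harness_winogrande_5",
    split="train",
)
print(data)
```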
[ "# Dataset Card for Evaluation run of binbi/SF-72B-V1\n\n\n\nDataset automatically created during the evaluation run of model binbi/SF-72B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:28:43.484005(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of binbi/SF-72B-V1\n\n\n\nDataset automatically created during the evaluation run of model binbi/SF-72B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T12:28:43.484005(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
bde3dae4fb30ed32d54538309362833bcfe3228f
# Dataset of narumeia/ナルメア (Granblue Fantasy)

This is the dataset of narumeia/ナルメア (Granblue Fantasy), containing 500 images and their tags.

The core tags of this character are `horns, long_hair, pointy_ears, hair_over_one_eye, breasts, large_breasts, blue_eyes, hair_ornament, light_purple_hair, braid, very_long_hair, pink_hair`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                            | Type       | Description                                                           |
|:-----------------|-------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              | 500    | 949.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumeia_granbluefantasy/resolve/main/dataset-raw.zip)                           | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              | 500    | 467.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumeia_granbluefantasy/resolve/main/dataset-800.zip)                           | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 1324   | 1.04 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/narumeia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip)               | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 500    | 806.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumeia_granbluefantasy/resolve/main/dataset-1200.zip)                          | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1324   | 1.62 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/narumeia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip)              | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/narumeia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, belt, black_gloves, draph, elbow_gloves, katana, looking_at_viewer, sleeveless, solo, thigh_strap, uneven_gloves, butterfly, fingerless_gloves, holding_sword, single_braid, single_thighhigh, bare_shoulders, black_thighhighs, sheath, white_background, cleavage, demon_horns | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, blush, draph, looking_at_viewer, single_thighhigh, smile, solo, bare_shoulders, black_thighhighs, elbow_gloves, single_braid, thigh_strap, white_background, fingerless_gloves, simple_background, uneven_gloves, open_mouth, sleeveless, sitting, white_vest | | 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, draph, elbow_gloves, simple_background, solo, blush, sleeveless, looking_at_viewer, white_background, fingerless_gloves, open_mouth, single_braid, bare_shoulders, upper_body, :d, heart | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_gloves, draph, katana, solo, elbow_gloves, holding_sword, looking_at_viewer, single_braid, sleeveless, sheath, bare_shoulders, butterfly, fingerless_gloves | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, black_gloves, draph, fur_trim, holly, looking_at_viewer, official_alternate_costume, open_mouth, solo, upper_body, blush, christmas, simple_background, smile, white_background, hand_on_own_chest, pom_pom_(clothes), cleavage, sash, single_braid | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, crown_braid, detached_sleeves, draph, fur_trim, holly, looking_at_viewer, official_alternate_costume, pom_pom_(clothes), smile, solo, christmas, cleavage, closed_mouth, upper_body, blush, sash, single_braid | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, blush, draph, looking_at_viewer, official_alternate_costume, smile, solo, black_gloves, christmas, cleavage, fur_trim, garter_straps, thighhighs, open_mouth, holly, single_braid, bow, detached_sleeves, 
dress, heart-shaped_pupils | | 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, cleavage, draph, looking_at_viewer, official_alternate_costume, solo, white_bikini, earrings, bare_shoulders, collarbone, blush, double_bun, navel, thigh_strap, smile, side-tie_bikini_bottom, thighs, white_background | | 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, cleavage_cutout, draph, heart_hair_ornament, looking_at_viewer, official_alternate_costume, solo, hair_bow, heart_cutout, heart-shaped_pupils, blush, crown_braid, frilled_apron, long_sleeves, purple_bow, purple_hair, smile, white_background, black_sweater, chocolate, open_mouth, ribbed_sweater, simple_background, thighs, white_apron | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, draph, looking_at_viewer, solo, ponytail, smile, yukata, blush, hair_flower, cleavage, holding, obi, paper_fan, purple_hair | | 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, 1girl, draph, hetero, penis, solo_focus, vaginal, blush, cum_in_pussy, navel, nipples, completely_nude, looking_at_viewer, open_mouth, spread_legs, bangs, uncensored, girl_on_top, huge_breasts, pov, sex_from_behind, sweat, testicles, clitoris, collarbone, mosaic_censoring, purple_hair, single_braid, straddling, thighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | belt | black_gloves | draph | elbow_gloves | katana | looking_at_viewer | sleeveless | solo | thigh_strap | uneven_gloves | butterfly | fingerless_gloves | holding_sword | single_braid | single_thighhigh | bare_shoulders | black_thighhighs | sheath | white_background | cleavage | demon_horns | blush | smile | simple_background | open_mouth | sitting | white_vest | upper_body | :d | heart | fur_trim | holly | official_alternate_costume | christmas | hand_on_own_chest | pom_pom_(clothes) | sash | crown_braid | detached_sleeves | closed_mouth | garter_straps | thighhighs | bow | dress | heart-shaped_pupils | white_bikini | earrings | collarbone | double_bun | navel | side-tie_bikini_bottom | thighs | cleavage_cutout | heart_hair_ornament | hair_bow | heart_cutout | frilled_apron | long_sleeves | purple_bow | purple_hair | black_sweater | chocolate | ribbed_sweater | white_apron | ponytail | yukata | hair_flower | holding | obi | paper_fan | 1boy | hetero | penis | solo_focus | vaginal | cum_in_pussy | nipples | completely_nude | spread_legs | bangs | uncensored | girl_on_top | huge_breasts | pov | sex_from_behind | sweat | testicles | clitoris | mosaic_censoring | straddling | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:---------------|:--------|:---------------|:---------|:--------------------|:-------------|:-------|:--------------|:----------------|:------------|:--------------------|:----------------|:---------------|:-------------------|:-----------------|:-------------------|:---------|:-------------------|:-----------|:--------------|:--------|:--------|:--------------------|:-------------|:----------|:-------------|:-------------|:-----|:--------|:-----------|:--------|:-----------------------------|:------------|:--------------------|:--------------------|:-------|:--------------|:-------------------|:---------------|:----------------|:-------------|:------|:--------|:----------------------|:---------------|:-----------|:-------------|:-------------|:--------|:-------------------------|:---------|:------------------|:----------------------|:-----------|:---------------|:----------------|:---------------|:-------------|:--------------|:----------------|:------------|:-----------------|:--------------|:-----------|:---------|:--------------|:----------|:------|:------------|:-------|:---------|:--------|:-------------|:----------|:---------------|:----------|:------------------|:--------------|:--------|:-------------|:--------------|:---------------|:------|:------------------|:--------|:------------|:-----------|:-------------------|:-------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | | X | X | X | X | X | | X | | X | X | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | | X | X | X | | | | X | | X | | X | | | X | | | X | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | X | X | X | X | | | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | | | X | | X | | | | | | X | | X | | | X | X | | X | X | X | X | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | | X | | X | | | | | | X | | X | | | | X | | X | X | | | | | X | | | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | | | X | | X | | | | | | X | | X | | | | X | | X | X | | X | | | | | | X | X | X | X | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | | | X | | X | X | | | | | | | X | | | X | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | | | X | | X | | | | | | | | | | | X | | | X | X | X | X | | | | | | | | X | | | | | X | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | | | X | | X | | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | | | X | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
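The IMG+TXT packages listed above can also be used without waifuc. Below is a minimal loading sketch, assuming the archive contains flat image files each paired with a same-named `.txt` tag file; the exact layout inside `dataset-800.zip` is an assumption based on the package description, not a documented guarantee.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from PIL import Image

# download the 800px IMG+TXT package (filename taken from the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/narumeia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the archive to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# iterate over image/tag pairs; the same-named .txt convention is an assumption
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    image = Image.open(os.path.join(dataset_dir, name))
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    tags = ''
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
    print(name, image.size, tags)
```

The same pattern applies to the other IMG+TXT packages (`dataset-1200.zip` and the stage3 crops); only the filename changes.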
CyberHarem/narumeia_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T12:51:32+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:05:15+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of narumeia/ナルメア (Granblue Fantasy) =========================================== This is the dataset of narumeia/ナルメア (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'horns, long\_hair, pointy\_ears, hair\_over\_one\_eye, breasts, large\_breasts, blue\_eyes, hair\_ornament, light\_purple\_hair, braid, very\_long\_hair, pink\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
e1e780f8168468b046c6ca0d736382e6e7dbf88a
# Dataset of gita/ジータ (Granblue Fantasy) This is the dataset of gita/ジータ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `short_hair, blonde_hair, brown_eyes, breasts, hairband, bangs, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 760.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 414.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1236 | 907.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 671.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1236 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gita_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/gita_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, collarbone, gauntlets, holding_sword, looking_at_viewer, pink_dress, short_sleeves, white_background, zettai_ryouiki, bow, brown_thighhighs, cleavage, hair_intakes, pink_skirt, puffy_sleeves, smile, solo, white_shirt, blush, closed_mouth, red_hairband, simple_background, thigh_boots, brown_footwear, cowboy_shot, open_mouth, pink_hairband, thighs, unsheathed, v-shaped_eyebrows | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, solo, black_gloves, smile, red_necktie, upper_body, simple_background, white_background, stethoscope, labcoat, sleeveless | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, pleated_skirt, witch_hat, looking_at_viewer, solo, thigh_boots, thighhighs, white_gloves, white_skirt, collarbone, puffy_short_sleeves, simple_background, blush, cleavage, neckerchief, shirt, smile, white_background, black_headwear, open_mouth, sailor_collar, staff, zettai_ryouiki | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, smile, solo, bare_shoulders, white_dress, armlet, bracelet, cleavage, large_breasts, sideboob, veil, white_background, blush, yellow_eyes, covered_nipples, open_mouth, revealing_clothes | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, alternate_costume, white_gloves, looking_at_viewer, puffy_short_sleeves, solo, skirt, hair_ribbon, open_mouth, blue_ribbon, hair_bow, blush, :d, one_eye_closed, simple_background, white_background | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, looking_at_viewer, sailor_collar, serafuku, solo, twin_braids, hair_bow, pleated_skirt, holding_bag, medium_hair, official_alternate_costume, pink_hairband, school_bag, simple_background, white_background, white_shirt, yellow_eyes, black_skirt, blush, petals, pink_neckerchief, puffy_short_sleeves, smile | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, fur_trim, looking_at_viewer, 
paw_gloves, solo, upper_body, blush, cat_hood, hood_up, open_mouth, :d, fake_animal_ears, black_gloves, cat_ears, long_sleeves, simple_background, white_background | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, midriff, navel, solo, hair_ornament, looking_at_viewer, crop_top, earrings, short_shorts, armor, black_hairband, cape, stomach, black_gloves, blush, cleavage_cutout, red_shorts, thighhighs, belt, closed_mouth, gauntlets, smile, thighs, ahoge, ear_piercing, groin, hair_between_eyes, holding, weapon | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, looking_at_viewer, red_eyes, solo, black_jacket, off_shoulder, x_hair_ornament, bare_shoulders, crop_top, midriff, navel, belt, black_choker, collarbone, earrings, simple_background, smile, tongue_out, fishnets, grey_hair, shorts, white_background, feather_boa, fruit, open_clothes, open_mouth, single_leg_pantyhose, skirt, black_nails, cleavage, holding, swept_bangs | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | blush, 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, navel, official_alternate_costume, smile, solo, beach, open_mouth, bikini, hair_ornament, innertube, thigh_strap, day, large_breasts, one-piece_swimsuit, blue_sky, outdoors, thighs | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, solo, hood_down, long_sleeves, looking_at_viewer, blue_eyes, blue_hair, hair_ornament, hooded_jacket, white_background, black_hairband, simple_background, smile, black_jacket, open_jacket, shirt, black_shorts, collarbone, jewelry, parted_lips, short_shorts, white_jacket | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, blush, braid, floral_print, hair_flower, holding_food, looking_at_viewer, upper_body, yukata, cotton_candy, obi, solo, eating, open_mouth, pink_kimono, print_kimono, fireworks, night, white_kimono, wide_sleeves, yellow_eyes | | 12 | 5 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, hair_flower, looking_at_viewer, obi, smile, solo, wide_sleeves, yukata, blush, floral_print, pink_kimono, upper_body, hair_bobbles, long_sleeves, print_kimono, twin_braids, aerial_fireworks, closed_mouth, food, holding, night_sky, open_mouth, outdoors, twitter_username, white_kimono, yellow_eyes | | 13 | 25 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | 1girl, fake_animal_ears, rabbit_ears, blush, looking_at_viewer, solo, hair_flower, wrist_cuffs, cape, playboy_bunny, rabbit_tail, smile, thighhighs, open_mouth, alternate_costume, white_leotard, cleavage, short_sleeves, simple_background, large_breasts, white_background | | 14 | 8 | ![](samples/14/clu14-sample0.png) | 
![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | 1boy, 1girl, blush, hetero, solo_focus, mosaic_censoring, nipples, simple_background, open_mouth, white_background, completely_nude, penis, yellow_eyes, ass, collarbone, girl_on_top, large_breasts, navel, pink_hairband, pussy, sex, straddling | | 15 | 5 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | 1boy, 1girl, hetero, nipples, solo_focus, bikini, open_mouth, paizuri, smile, blush, looking_at_viewer, pov, yellow_eyes, fang, gigantic_breasts, huge_breasts, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | gauntlets | holding_sword | looking_at_viewer | pink_dress | short_sleeves | white_background | zettai_ryouiki | bow | brown_thighhighs | cleavage | hair_intakes | pink_skirt | puffy_sleeves | smile | solo | white_shirt | blush | closed_mouth | red_hairband | simple_background | thigh_boots | brown_footwear | cowboy_shot | open_mouth | pink_hairband | thighs | unsheathed | v-shaped_eyebrows | bare_shoulders | black_gloves | red_necktie | upper_body | stethoscope | labcoat | sleeveless | pleated_skirt | witch_hat | thighhighs | white_gloves | white_skirt | puffy_short_sleeves | neckerchief | shirt | black_headwear | sailor_collar | staff | white_dress | armlet | bracelet | large_breasts | sideboob | veil | yellow_eyes | covered_nipples | revealing_clothes | alternate_costume | skirt | hair_ribbon | blue_ribbon | hair_bow | :d | one_eye_closed | serafuku | twin_braids | holding_bag | medium_hair | official_alternate_costume | school_bag | black_skirt | petals | pink_neckerchief | fur_trim | paw_gloves | cat_hood | hood_up | fake_animal_ears | cat_ears | long_sleeves | midriff | navel | hair_ornament | crop_top | earrings | short_shorts | armor | black_hairband | cape | stomach | cleavage_cutout | red_shorts | belt | ahoge | ear_piercing | groin | hair_between_eyes | holding | weapon | red_eyes | black_jacket | off_shoulder | x_hair_ornament | black_choker | tongue_out | fishnets | grey_hair | shorts | feather_boa | fruit | open_clothes | single_leg_pantyhose | black_nails | swept_bangs | beach | bikini | innertube | thigh_strap | day | one-piece_swimsuit | blue_sky | outdoors | hood_down | blue_eyes | blue_hair | hooded_jacket | open_jacket | black_shorts | jewelry | parted_lips | white_jacket | braid | floral_print | hair_flower | holding_food | yukata | cotton_candy | obi | eating | pink_kimono | print_kimono | fireworks | night | white_kimono | wide_sleeves | hair_bobbles | aerial_fireworks | food | night_sky | twitter_username | rabbit_ears | wrist_cuffs | playboy_bunny | rabbit_tail | white_leotard | 1boy | hetero | solo_focus | mosaic_censoring | nipples | completely_nude | penis | ass | girl_on_top | pussy | sex | straddling | paizuri | pov | fang | gigantic_breasts | huge_breasts | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------------|:------------|:----------------|:--------------------|:-------------|:----------------|:-------------------|:-----------------|:------|:-------------------|:-----------|:---------------|:-------------|:----------------|:--------|:-------|:--------------|:--------|:---------------|:---------------|:--------------------|:--------------|:-----------------|:--------------|:-------------|:----------------|:---------|:-------------|:--------------------|:-----------------|:---------------|:--------------|:-------------|:--------------|:----------|:-------------|:----------------|:------------|:-------------|:---------------|:--------------|:----------------------|:--------------|:--------|:-----------------|:----------------|:--------|:--------------|:---------|:-----------|:----------------|:-----------|:-------|:--------------|:------------------|:--------------------|:--------------------|:--------|:--------------|:--------------|:-----------|:-----|:-----------------|:-----------|:--------------|:--------------|:--------------|:-----------------------------|:-------------|:--------------|:---------|:-------------------|:-----------|:-------------|:-----------|:----------|:-------------------|:-----------|:---------------|:----------|:--------|:----------------|:-----------|:-----------|:---------------|:--------|:-----------------|:-------|:----------|:------------------|:-------------|:-------|:--------|:---------------|:--------|:--------------------|:----------|:---------|:-----------|:---------------|:---------------|:------------------|:---------------|:-------------|:-----------|:------------|:---------|:--------------|:--------|:---------------|:-----------------------|:--------------|:--------------|:--------|:---------|:------------|:--------------|:------|:---------------------|:-----------|:-----------|:------------|:------------|:------------|:----------------|:--------------|:---------------|:----------|:--------------|:---------------|:--------|:---------------|:--------------|:---------------|:---------|:---------------|:------|:---------|:--------------|:---------------|:------------|:--------|:---------------|:---------------|:---------------|:-------------------|:-------|:------------|:-------------------|:--------------|:--------------|:----------------|:--------------|:----------------|:-------|:---------|:-------------|:-------------------|:----------|:------------------|:--------|:------|:--------------|:--------|:------|:-------------|:----------|:------|:-------|:-------------------|:---------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) 
| X | | | | X | | | X | | | | | | | | X | X | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | | | X | X | | | X | | | | X | X | | X | | | X | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | X | | | X | | | | X | | | | X | X | | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | | X | | | X | | | | | | | | | X | | X | | | X | | | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | X | | | X | | | | | | | | X | X | X | X | | | X | | | | | X | | | | | | | | | | | X | | | | | X | | | | X | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | X | | | X | | | | | | | | | X | | X | | | X | | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | X | | | | | | | | | | | X | X | | X | X | | | | | | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | 
X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | X | | | X | | | | X | | | | X | X | | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | X | | | | | | | X | | | | X | X | | X | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | | | X | | | X | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | | X | | | | | | | | | | | | X | | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 12 | 5 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | | | | X | | | | | | | | | | | X | X | | X | X | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | | X | | X | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 13 | 25 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | X | | | | X | | X | X | | | | X | | | | X | X | | X | | | X | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 14 | 8 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | X | X | | | | | | X | | | | | | | | | | | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | 15 | 5 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | X | | | | X | | | X | | | | | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | X | | | | | | | | X | X | X | X | X |
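Building on the waifuc loading snippet above, the sketch below keeps only the items carrying a particular tag and saves them to a separate folder. It assumes `item.meta['tags']` supports a membership test on tag names and that `item.image` is a PIL image; both are assumptions about waifuc's item structure rather than documented guarantees.

```python
import os

from waifuc.source import LocalSource

# directory produced by extracting dataset-raw.zip (see the snippet above)
dataset_dir = 'dataset_dir'
output_dir = 'gita_filtered'
os.makedirs(output_dir, exist_ok=True)

# tag to keep; any tag from the cluster tables above can be used here
wanted_tag = 'official_alternate_costume'

source = LocalSource(dataset_dir)
for index, item in enumerate(source):
    tags = item.meta.get('tags', {})
    # membership test on tag names is an assumption about the tags structure
    if wanted_tag not in tags:
        continue
    # item.image is assumed to be a PIL image, so .save() is available
    item.image.save(os.path.join(output_dir, f'{index}.png'))
```

Swapping `wanted_tag` for another entry from the cluster tables gives a quick way to pull a single outfit out of the crawl.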
CyberHarem/gita_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T12:51:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:05:03+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of gita/ジータ (Granblue Fantasy) ====================================== This is the dataset of gita/ジータ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'short\_hair, blonde\_hair, brown\_eyes, breasts, hairband, bangs, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
7e8f489faddc06ca0fb2f79e721d4799bc003a42
# Dataset of cagliostro/カリオストロ (Granblue Fantasy) This is the dataset of cagliostro/カリオストロ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `blonde_hair, long_hair, purple_eyes, bangs, hairband, crown, bow, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 697.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cagliostro_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 402.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cagliostro_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1201 | 853.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cagliostro_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 622.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cagliostro_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1201 | 1.17 GiB | [Download](https://huggingface.co/datasets/CyberHarem/cagliostro_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/cagliostro_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 27 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, 1boy, hetero, solo_focus, open_mouth, censored, nipples, penis, smile, thighhighs, looking_at_viewer, cum, nude, small_breasts, pussy, sex, vaginal, cape, heart | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_thighhighs, boots, cape, sitting, skirt, solo, open_mouth, smile, looking_at_viewer, blush, chair, crossed_legs, jewelry | | 2 | 18 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_thighhighs, cape, looking_at_viewer, smile, solo, blush, red_skirt, simple_background, book, white_background, zettai_ryouiki, boots, open_mouth | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, simple_background, solo, upper_body, white_background, blush, closed_mouth, red_bow, brooch, smile, blunt_bangs, bowtie, one_eye_closed, red_cape | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blue_skirt, long_sleeves, looking_at_viewer, official_alternate_costume, school_uniform, smile, solo, white_thighhighs, zettai_ryouiki, blush, plaid_skirt, ribbon, simple_background, white_background, open_mouth, very_long_hair, bag, blazer, blue_jacket, book, collared_shirt, frills, full_body, hand_up, open_jacket, pleated_skirt, shoes, sitting, white_shirt | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, cape, hood, looking_at_viewer, official_alternate_costume, puffy_short_sleeves, smile, solo, black_thighhighs, orange_bowtie, orange_skirt, frilled_skirt, holding, one_eye_closed, pumpkin, striped, suspenders, teeth, white_shirt, book, candy, halloween_costume, jack-o'-lantern, mismatched_legwear, star_(symbol) | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, looking_at_viewer, smile, solo, star_(symbol), bare_shoulders, blush, white_dress, closed_mouth, detached_sleeves, frills, upper_body | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | 
![](samples/7/clu7-sample4.png) | 1girl, looking_at_viewer, solo, white_dress, closed_mouth, long_sleeves, very_long_hair, white_rose, bare_shoulders, blush, frilled_dress, hair_flower, smile, star_(symbol), off-shoulder_dress | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, eyewear_on_head, looking_at_viewer, official_alternate_costume, ponytail, solo, sunglasses, blue_one-piece_swimsuit, heart-shaped_eyewear, blush, hair_ornament, open_mouth, thigh_strap, flower, grin, simple_background, white_background | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, eyewear_on_head, looking_at_viewer, official_alternate_costume, solo, sunglasses, thigh_strap, blue_one-piece_swimsuit, day, heart-shaped_eyewear, ocean, outdoors, ponytail, blue_sky, blush, water, cloud, collarbone, grin, hair_flower, very_long_hair, bare_shoulders, open_mouth, sailor_collar | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | 1boy | hetero | solo_focus | open_mouth | censored | nipples | penis | smile | thighhighs | looking_at_viewer | cum | nude | small_breasts | pussy | sex | vaginal | cape | heart | black_thighhighs | boots | sitting | skirt | solo | chair | crossed_legs | jewelry | red_skirt | simple_background | book | white_background | zettai_ryouiki | upper_body | closed_mouth | red_bow | brooch | blunt_bangs | bowtie | one_eye_closed | red_cape | blue_skirt | long_sleeves | official_alternate_costume | school_uniform | white_thighhighs | plaid_skirt | ribbon | very_long_hair | bag | blazer | blue_jacket | collared_shirt | frills | full_body | hand_up | open_jacket | pleated_skirt | shoes | white_shirt | hood | puffy_short_sleeves | orange_bowtie | orange_skirt | frilled_skirt | holding | pumpkin | striped | suspenders | teeth | candy | halloween_costume | jack-o'-lantern | mismatched_legwear | star_(symbol) | bare_shoulders | white_dress | detached_sleeves | white_rose | frilled_dress | hair_flower | off-shoulder_dress | eyewear_on_head | ponytail | sunglasses | blue_one-piece_swimsuit | heart-shaped_eyewear | hair_ornament | thigh_strap | flower | grin | day | ocean | outdoors | blue_sky | water | cloud | collarbone | sailor_collar | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:---------|:-------------|:-------------|:-----------|:----------|:--------|:--------|:-------------|:--------------------|:------|:-------|:----------------|:--------|:------|:----------|:-------|:--------|:-------------------|:--------|:----------|:--------|:-------|:--------|:---------------|:----------|:------------|:--------------------|:-------|:-------------------|:-----------------|:-------------|:---------------|:----------|:---------|:--------------|:---------|:-----------------|:-----------|:-------------|:---------------|:-----------------------------|:-----------------|:-------------------|:--------------|:---------|:-----------------|:------|:---------|:--------------|:-----------------|:---------|:------------|:----------|:--------------|:----------------|:--------|:--------------|:-------|:----------------------|:----------------|:---------------|:----------------|:----------|:----------|:----------|:-------------|:--------|:--------|:--------------------|:------------------|:---------------------|:----------------|:-----------------|:--------------|:-------------------|:-------------|:----------------|:--------------|:---------------------|:------------------|:-----------|:-------------|:--------------------------|:-----------------------|:----------------|:--------------|:---------|:-------|:------|:--------|:-----------|:-----------|:--------|:--------|:-------------|:----------------| | 0 | 27 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | X | | | | X | | X | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 18 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | | | | X | | X | | | | | | | X | | X | X | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | | | | X | | X | | | | | | | | | | | | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | X | | | | X | | X | | | | | | | | | | | X | | X | | | | | X | X | X | X | 
| | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | | | | | | X | | X | | | | | | | X | | X | | | | X | | | | | | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | 8 | 12 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | X | | | | | | X | | | | | | | | | | | | | X | | | | | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X |
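For a quick overview that goes beyond the fixed cluster tables above, tag frequencies can be tallied over the raw dataset. A minimal sketch, assuming the extraction snippet above has already been run and that iterating `item.meta['tags']` yields tag names:

```python
from collections import Counter

from waifuc.source import LocalSource

# directory produced by extracting dataset-raw.zip (see the snippet above)
dataset_dir = 'dataset_dir'

counter = Counter()
for item in LocalSource(dataset_dir):
    # iterate tag names only; this works whether tags is a list or a mapping
    for tag in item.meta.get('tags', {}):
        counter[tag] += 1

# print the 20 most frequent tags with their counts
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```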
CyberHarem/cagliostro_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T12:51:47+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:46:37+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of cagliostro/カリオストロ (Granblue Fantasy) =============================================== This is the dataset of cagliostro/カリオストロ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, long\_hair, purple\_eyes, bangs, hairband, crown, bow, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
771d2097d00474c4d11a0610c7026c533415243b
# Dataset of vikala/ビカラ (Granblue Fantasy) This is the dataset of vikala/ビカラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `red_eyes, bangs, animal_ears, mouse_ears, hair_ornament, short_hair, bow, fake_animal_ears, white_hair, hairclip, hair_bow, hairband, red_bow, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 921.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 465.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1362 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 793.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1362 | 1.60 GiB | [Download](https://huggingface.co/datasets/CyberHarem/vikala_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/vikala_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
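Beyond the waifuc loader above, the IMG+TXT packages (e.g. `dataset-800.zip`) can be consumed with plain Python. The sketch below is a minimal example under stated assumptions: it assumes the archive unpacks into image files paired with same-named `.txt` files holding comma-separated tags (which is what the IMG+TXT type denotes here), and the local directory name `vikala_800` is arbitrary.

```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (filename taken from the List of Packages table)
zip_file = hf_hub_download(
    repo_id='CyberHarem/vikala_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract into a local working directory (name chosen arbitrarily)
dataset_dir = 'vikala_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed IMG+TXT layout)
image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}
for img_path in sorted(Path(dataset_dir).rglob('*')):
    if img_path.suffix.lower() not in image_suffixes:
        continue
    txt_path = img_path.with_suffix('.txt')
    if txt_path.exists():
        tags = [t.strip() for t in txt_path.read_text(encoding='utf-8').split(',') if t.strip()]
        print(img_path.name, tags[:5])
```

The same pattern should work for the other packages listed above; only the `filename` argument changes.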
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, black_hair, blush, hair_bobbles, strapless_shirt, sun_hat, short_twintails, wrist_scrunchie, bare_shoulders, low_twintails, navel, midriff, black_shorts, crop_top, black_shirt, collarbone, short_shorts, water, closed_mouth, shoulder_bag, small_breasts, white_jacket, mouse, outdoors, smile | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1boy, 1girl, blush, hetero, open_mouth, penis, sex, vaginal, nipples, solo_focus, navel, cum_in_pussy, spread_legs, bar_censor, small_breasts, bikini, looking_at_viewer, medium_breasts, mosaic_censoring | | 2 | 25 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bowtie, crop_top, heart_brooch, long_sleeves, looking_at_viewer, midriff, solo, white_shirt, white_skirt, wide_sleeves, navel, pleated_skirt, miniskirt, collared_shirt, open_mouth, mouse, :d, blush, animal, cowboy_shot, frilled_skirt, holding_balloon, white_background, grey_hair | | 3 | 60 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, solo, eyewear_on_head, hair_flower, blush, striped_bikini, sunglasses, navel, bare_shoulders, open_mouth, outdoors, white_skirt, wrist_scrunchie, small_breasts, day, water, bikini_skirt, cleavage, :d, ocean, blue_sky, bridal_garter, choker, frills | | 4 | 15 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_hair, long_sleeves, black_skirt, looking_at_viewer, blush, pleated_skirt, sailor_collar, closed_mouth, mouse, solo, blue_bow, blue_jacket, sleeves_past_wrists, white_background, white_shirt, white_thighhighs, bag, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_hair | blush | hair_bobbles | strapless_shirt | sun_hat | short_twintails | wrist_scrunchie | bare_shoulders | low_twintails | navel | midriff | black_shorts | crop_top | black_shirt | collarbone | short_shorts | water | closed_mouth | shoulder_bag | small_breasts | white_jacket | mouse | outdoors | smile | 1boy | hetero | open_mouth | penis | sex | vaginal | nipples | solo_focus | cum_in_pussy | spread_legs | bar_censor | bikini | medium_breasts | mosaic_censoring | bowtie | heart_brooch | long_sleeves | white_shirt | white_skirt | wide_sleeves | pleated_skirt | miniskirt | collared_shirt 
| :d | animal | cowboy_shot | frilled_skirt | holding_balloon | white_background | grey_hair | eyewear_on_head | hair_flower | striped_bikini | sunglasses | day | bikini_skirt | cleavage | ocean | blue_sky | bridal_garter | choker | frills | black_skirt | sailor_collar | blue_bow | blue_jacket | sleeves_past_wrists | white_thighhighs | bag | simple_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------|:--------|:---------------|:------------------|:----------|:------------------|:------------------|:-----------------|:----------------|:--------|:----------|:---------------|:-----------|:--------------|:-------------|:---------------|:--------|:---------------|:---------------|:----------------|:---------------|:--------|:-----------|:--------|:-------|:---------|:-------------|:--------|:------|:----------|:----------|:-------------|:---------------|:--------------|:-------------|:---------|:-----------------|:-------------------|:---------|:---------------|:---------------|:--------------|:--------------|:---------------|:----------------|:------------|:-----------------|:-----|:---------|:--------------|:----------------|:------------------|:-------------------|:------------|:------------------|:--------------|:-----------------|:-------------|:------|:---------------|:-----------|:--------|:-----------|:----------------|:---------|:---------|:--------------|:----------------|:-----------|:--------------|:----------------------|:-------------------|:------|:--------------------| | 0 | 23 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | X | | | | | | | | X | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 25 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | | | | | | | | X | X | | X | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 3 | 60 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | | | | X | X | | X | | | | | | | X | | | X | | | X | | | | X | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | 4 | 15 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | X 
| | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
CyberHarem/vikala_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T12:53:17+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:19:29+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of vikala/ビカラ (Granblue Fantasy) ======================================== This is the dataset of vikala/ビカラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'red\_eyes, bangs, animal\_ears, mouse\_ears, hair\_ornament, short\_hair, bow, fake\_animal\_ears, white\_hair, hairclip, hair\_bow, hairband, red\_bow, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
846f5c6426aa9242c407a681c752a609dfaeb981
# Dataset of anila/アニラ (Granblue Fantasy) This is the dataset of anila/アニラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `blonde_hair, long_hair, horns, breasts, sheep_horns, ahoge, bangs, thick_eyebrows, yellow_eyes, large_breasts, short_eyebrows, very_long_hair, blunt_bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 821.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anila_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 437.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anila_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1307 | 995.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anila_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 714.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anila_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1307 | 1.46 GiB | [Download](https://huggingface.co/datasets/CyberHarem/anila_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/anila_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 63 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, draph, official_alternate_costume, solo, blush, cleavage, looking_at_viewer, white_bikini, bare_shoulders, collarbone, detached_sleeves, smile, navel, thighs, open_mouth, layered_bikini, simple_background, white_background | | 1 | 51 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, draph, solo, looking_at_viewer, cleavage, smile, white_thighhighs, pleated_skirt, white_gloves, blush, black_skirt, fur_trim, simple_background, cape, open_mouth, white_background, miniskirt | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, draph, fur_trim, solo, upper_body, white_gloves, blush, cleavage, looking_at_viewer, smile, :3, breast_hold, simple_background, white_background | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, blush, cleavage, draph, kimono, ponytail, looking_at_viewer, off_shoulder, smile, solo, collarbone, simple_background, wide_sleeves, white_background, obi | | 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, draph, nipples, blush, collarbone, navel, solo, looking_at_viewer, completely_nude, open_mouth, smile, sitting | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, draph, hetero, nipples, paizuri, solo_focus, smile, looking_at_viewer, sweat, breasts_squeezed_together, penis, pov, huge_breasts, brown_eyes, cum_on_breasts, mosaic_censoring, open_mouth | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, blush, draph, hetero, open_mouth, 1girl, penis, smile, solo_focus, bar_censor, bikini, looking_at_viewer, nipples, official_alternate_costume, sweat, handjob | | 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1boy, 1girl, blush, draph, hetero, nipples, sex, solo_focus, cowgirl_position, girl_on_top, navel, vaginal, open_mouth, penis, sweat, cum_in_pussy, smile, completely_nude, pov, censored, looking_at_viewer, spread_legs | | 8 | 6 | ![](samples/8/clu8-sample0.png) | 
![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, draph, looking_at_viewer, school_uniform, short_sleeves, solo, black_skirt, curled_horns, navel, pleated_skirt, white_background, white_shirt, alternate_costume, midriff, simple_background, bow, brown_eyes, crop_top_overhang, glasses, red-framed_eyewear, sailor_collar, thighs, underboob | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | official_alternate_costume | solo | blush | cleavage | looking_at_viewer | white_bikini | bare_shoulders | collarbone | detached_sleeves | smile | navel | thighs | open_mouth | layered_bikini | simple_background | white_background | white_thighhighs | pleated_skirt | white_gloves | black_skirt | fur_trim | cape | miniskirt | upper_body | :3 | breast_hold | kimono | ponytail | off_shoulder | wide_sleeves | obi | nipples | completely_nude | sitting | 1boy | hetero | paizuri | solo_focus | sweat | breasts_squeezed_together | penis | pov | huge_breasts | brown_eyes | cum_on_breasts | mosaic_censoring | bar_censor | bikini | handjob | sex | cowgirl_position | girl_on_top | vaginal | cum_in_pussy | censored | spread_legs | school_uniform | short_sleeves | curled_horns | white_shirt | alternate_costume | midriff | bow | crop_top_overhang | glasses | red-framed_eyewear | sailor_collar | underboob | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------------------|:-------|:--------|:-----------|:--------------------|:---------------|:-----------------|:-------------|:-------------------|:--------|:--------|:---------|:-------------|:-----------------|:--------------------|:-------------------|:-------------------|:----------------|:---------------|:--------------|:-----------|:-------|:------------|:-------------|:-----|:--------------|:---------|:-----------|:---------------|:---------------|:------|:----------|:------------------|:----------|:-------|:---------|:----------|:-------------|:--------|:----------------------------|:--------|:------|:---------------|:-------------|:-----------------|:-------------------|:-------------|:---------|:----------|:------|:-------------------|:--------------|:----------|:---------------|:-----------|:--------------|:-----------------|:----------------|:---------------|:--------------|:--------------------|:----------|:------|:--------------------|:----------|:---------------------|:----------------|:------------| | 0 | 63 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 51 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | 
![](samples/2/clu2-sample4.png) | X | X | | X | X | X | X | | | | | X | | | | | X | X | | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | X | X | | X | X | | X | | | | | X | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | X | | X | | | X | | X | X | | X | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | X | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | X | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | X | X | | X | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | X | | X | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | X | | X | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | X | X | | X | | | | | | X | X | | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/anila_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:05:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:08:04+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of anila/アニラ (Granblue Fantasy) ======================================= This is the dataset of anila/アニラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, long\_hair, horns, breasts, sheep\_horns, ahoge, bangs, thick\_eyebrows, yellow\_eyes, large\_breasts, short\_eyebrows, very\_long\_hair, blunt\_bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
0dcaa11eba467d87555583f9ae6b9f41dd90fd08
# Dataset of clarisse/クラリス (Granblue Fantasy) This is the dataset of clarisse/クラリス (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `long_hair, breasts, ribbon, ponytail, hair_ribbon, green_eyes, brown_hair, medium_breasts, orange_hair, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 678.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 403.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1220 | 849.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 609.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1220 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/clarisse_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/clarisse_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
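Since the clusters listed below are essentially sets of co-occurring tags, a quick way to mine outfits yourself is to count tag frequencies over an extracted IMG+TXT package. The following is a small sketch under the same assumptions as before (same-named `.txt` tag files with comma-separated tags); the directory name `clarisse_800` is a placeholder for wherever you extracted `dataset-800.zip`.

```python
from collections import Counter
from pathlib import Path

# placeholder path: an already-extracted IMG+TXT package (e.g. dataset-800.zip)
dataset_dir = Path('clarisse_800')

tag_counts = Counter()
for txt_path in dataset_dir.rglob('*.txt'):
    tags = [t.strip() for t in txt_path.read_text(encoding='utf-8').split(',')]
    tag_counts.update(t for t in tags if t)

# frequent tags roughly correspond to the recurring outfits in the cluster tables below
for tag, count in tag_counts.most_common(20):
    print(f'{count:4d}  {tag}')
```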
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_gloves, skirt, solo, cape, looking_at_viewer, smile, black_thighhighs, one_eye_closed, open_mouth, ;d, book, boots, blush, v_over_eye, sleeveless | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, :d, black_gloves, black_ribbon, black_thighhighs, cape, looking_at_viewer, open_mouth, sleeveless, solo, sideboob, simple_background, very_long_hair, white_background, blush, red_skirt, test_tube, black_footwear, high_heel_boots, holding_book, knee_boots, open_book | | 2 | 26 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | cape, 1girl, black_gloves, christmas, santa_hat, solo, navel, black_thighhighs, blush, fur_trim, looking_at_viewer, cleavage, open_mouth, santa_bikini, one_eye_closed, boots, red_bikini, very_long_hair, :d | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, detached_sleeves, hairband, long_sleeves, looking_at_viewer, solo, white_background, white_shirt, bare_shoulders, closed_mouth, red_skirt, simple_background, very_long_hair, bow, low_twintails, sleeveless_shirt, white_sweater, plaid_skirt, red_ribbon, sleeves_past_wrists, smile, thighhighs, turtleneck | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, hairband, looking_at_viewer, red_skirt, solo, bare_shoulders, detached_sleeves, plaid_skirt, valentine, very_long_hair, aqua_eyes, holding, long_sleeves, thighhighs, white_background, apron, scarf, simple_background, smile, closed_mouth, gift, heart-shaped_box, large_breasts, twintails, white_shirt | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, red_bikini, solo, hair_flower, cleavage, navel, very_long_hair, smile, beach, bracelet, cloud, collarbone, frilled_bikini, open_mouth, outdoors, sarong, sky | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, solo, black_thighhighs, turtleneck, sleeveless_shirt, very_long_hair, white_panties, armpits, closed_mouth, on_back, side-tie_panties, skirt_lift, smile, sweater, 
swept_bangs, white_shirt | | 7 | 23 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, hair_bow, solo, looking_at_viewer, official_alternate_costume, hair_flower, blush, chest_harness, white_dress, black_gloves, white_bow, elbow_gloves, white_background, bare_shoulders, earrings, red_rose, smile, pantyhose | | 8 | 21 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, hetero, solo_focus, 1boy, penis, nipples, open_mouth, pussy, thighhighs, large_breasts, bar_censor, sex, vaginal, sweat, spread_legs, cum, female_pubic_hair, gloves | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | skirt | solo | cape | looking_at_viewer | smile | black_thighhighs | one_eye_closed | open_mouth | ;d | book | boots | blush | v_over_eye | sleeveless | :d | black_ribbon | sideboob | simple_background | very_long_hair | white_background | red_skirt | test_tube | black_footwear | high_heel_boots | holding_book | knee_boots | open_book | christmas | santa_hat | navel | fur_trim | cleavage | santa_bikini | red_bikini | detached_sleeves | hairband | long_sleeves | white_shirt | bare_shoulders | closed_mouth | bow | low_twintails | sleeveless_shirt | white_sweater | plaid_skirt | red_ribbon | sleeves_past_wrists | thighhighs | turtleneck | valentine | aqua_eyes | holding | apron | scarf | gift | heart-shaped_box | large_breasts | twintails | hair_flower | beach | bracelet | cloud | collarbone | frilled_bikini | outdoors | sarong | sky | white_panties | armpits | on_back | side-tie_panties | skirt_lift | sweater | swept_bangs | hair_bow | official_alternate_costume | chest_harness | white_dress | white_bow | elbow_gloves | earrings | red_rose | pantyhose | hetero | solo_focus | 1boy | penis | nipples | pussy | bar_censor | sex | vaginal | sweat | spread_legs | cum | female_pubic_hair | gloves | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:-------|:--------------------|:--------|:-------------------|:-----------------|:-------------|:-----|:-------|:--------|:--------|:-------------|:-------------|:-----|:---------------|:-----------|:--------------------|:-----------------|:-------------------|:------------|:------------|:-----------------|:------------------|:---------------|:-------------|:------------|:------------|:------------|:--------|:-----------|:-----------|:---------------|:-------------|:-------------------|:-----------|:---------------|:--------------|:-----------------|:---------------|:------|:----------------|:-------------------|:----------------|:--------------|:-------------|:----------------------|:-------------|:-------------|:------------|:------------|:----------|:--------|:--------|:-------|:-------------------|:----------------|:------------|:--------------|:--------|:-----------|:--------|:-------------|:-----------------|:-----------|:---------|:------|:----------------|:----------|:----------|:-------------------|:-------------|:----------|:--------------|:-----------|:-----------------------------|:----------------|:--------------|:------------|:---------------|:-----------|:-----------|:------------|:---------|:-------------|:-------|:--------|:----------|:--------|:-------------|:------|:----------|:--------|:--------------|:------|:--------------------|:---------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 26 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | X | | X | X | X | | | X | X | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | X | X | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | X | | | | | | | X | | | | | | X | X | X | X | | | | | | | | 
| | | | | | X | X | X | X | X | X | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | X | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | X | X | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 23 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | | X | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 8 | 21 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/clarisse_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:05:55+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:58:19+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of clarisse/クラリス (Granblue Fantasy) =========================================== This is the dataset of clarisse/クラリス (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'long\_hair, breasts, ribbon, ponytail, hair\_ribbon, green\_eyes, brown\_hair, medium\_breasts, orange\_hair, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
ea12f89f5dc31778a636b872dd61157f030793a6
# Dataset of zeta/ゼタ (Granblue Fantasy) This is the dataset of zeta/ゼタ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `blonde_hair, long_hair, breasts, blue_eyes, twintails, hairband, large_breasts, bangs, hair_intakes, braid`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 819.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 448.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1276 | 976.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 727.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1276 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zeta_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/zeta_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
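The package descriptions above state size guarantees (for example, the 800 package keeps the shorter side at or below 800 pixels). If you want to verify that locally, a minimal check with Pillow might look like the sketch below; it assumes the 800 package has already been extracted into a directory named `zeta_800` (an arbitrary name) and that Pillow is installed.

```python
from pathlib import Path

from PIL import Image

# placeholder path: an already-extracted dataset-800.zip
dataset_dir = Path('zeta_800')
limit = 800  # stated bound on the shorter side for the 800 package

image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}
violations = 0
for img_path in dataset_dir.rglob('*'):
    if img_path.suffix.lower() not in image_suffixes:
        continue
    with Image.open(img_path) as im:
        # im.size is (width, height); the shorter side should not exceed the limit
        if min(im.size) > limit:
            violations += 1
            print('shorter side exceeds limit:', img_path.name, im.size)

print('images violating the stated bound:', violations)
```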
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, cleavage, looking_at_viewer, smile, thighhighs, midriff, navel, skirt, belt, spear, gauntlets, red_armor, holding, blush | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, holding_weapon, looking_at_viewer, solo, smile, white_background, simple_background, red_armor, spear, cleavage, sketch | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage, looking_at_viewer, simple_background, solo, white_background, blush, gauntlets, red_armor, upper_body, closed_mouth, grin, hair_ornament | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, belt, cleavage, crop_top, looking_at_viewer, midriff, navel, solo, sunglasses, black_skirt, collarbone, eyewear_on_head, pleated_skirt, smile, blush, miniskirt, off_shoulder, polearm, thighhighs, white_background, boots, coat, fur-trimmed_jacket, green_jacket, holding, long_sleeves, open_clothes, open_mouth, tank_top, thighs, zettai_ryouiki | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | black_gloves, cleavage, detached_sleeves, halloween_costume, looking_at_viewer, smile, witch_hat, 1girl, jack-o'-lantern, midriff, navel, official_alternate_costume, pumpkin, bare_shoulders, solo, striped, thighhighs, white_background, candy, halloween_bucket, open_mouth | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bracelet, cleavage, eyewear_on_head, looking_at_viewer, navel, official_alternate_costume, red_bikini, side-tie_bikini_bottom, sunglasses, shawl, solo, hair_flower, o-ring, medium_breasts, open_mouth, thighs, blush, collarbone, grin, polearm | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, beach, bracelet, cleavage, day, eyewear_on_head, looking_at_viewer, official_alternate_costume, outdoors, red_bikini, side-tie_bikini_bottom, solo, sunglasses, blush, navel, ocean, smile, blue_sky, hair_flower, bare_shoulders, collarbone, shawl, thighs | | 7 | 5 | ![](samples/7/clu7-sample0.png) | 
![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, earrings, looking_at_viewer, red_dress, smile, solo, medium_breasts, blush, bracelet, cleavage_cutout, hair_down, sleeveless_dress | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, looking_at_viewer, solo, blush, obi, red_kimono, smile, hair_flower, open_mouth, wide_sleeves, yukata | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | smile | thighhighs | midriff | navel | skirt | belt | spear | gauntlets | red_armor | holding | blush | holding_weapon | white_background | simple_background | sketch | upper_body | closed_mouth | grin | hair_ornament | bare_shoulders | crop_top | sunglasses | black_skirt | collarbone | eyewear_on_head | pleated_skirt | miniskirt | off_shoulder | polearm | boots | coat | fur-trimmed_jacket | green_jacket | long_sleeves | open_clothes | open_mouth | tank_top | thighs | zettai_ryouiki | black_gloves | detached_sleeves | halloween_costume | witch_hat | jack-o'-lantern | official_alternate_costume | pumpkin | striped | candy | halloween_bucket | bracelet | red_bikini | side-tie_bikini_bottom | shawl | hair_flower | o-ring | medium_breasts | beach | day | outdoors | ocean | blue_sky | earrings | red_dress | cleavage_cutout | hair_down | sleeveless_dress | obi | red_kimono | wide_sleeves | yukata | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------|:-------------|:----------|:--------|:--------|:-------|:--------|:------------|:------------|:----------|:--------|:-----------------|:-------------------|:--------------------|:---------|:-------------|:---------------|:-------|:----------------|:-----------------|:-----------|:-------------|:--------------|:-------------|:------------------|:----------------|:------------|:---------------|:----------|:--------|:-------|:---------------------|:---------------|:---------------|:---------------|:-------------|:-----------|:---------|:-----------------|:---------------|:-------------------|:--------------------|:------------|:------------------|:-----------------------------|:----------|:----------|:--------|:-------------------|:-----------|:-------------|:-------------------------|:--------|:--------------|:---------|:-----------------|:--------|:------|:-----------|:--------|:-----------|:-----------|:------------|:------------------|:------------|:-------------------|:------|:-------------|:---------------|:---------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | | | | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | | | | | | | X | X | | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | | X | | | | X | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | | | | X | | | | | | | X | | | | | | | X | | | | X | | X | X | | | | X | | | | | | | X | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | | X | | | | | | | X | | | | | | | | | X | | X | | X | X | | | | | | | | | | | | | X | | | | | | | X | | | | | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | X | X | X | X | X | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X |
CyberHarem/zeta_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:06:00+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:57:17+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of zeta/ゼタ (Granblue Fantasy) ===================================== This is the dataset of zeta/ゼタ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, long\_hair, breasts, blue\_eyes, twintails, hairband, large\_breasts, bangs, hair\_intakes, braid', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
8ce8e6cee24317e18c6793d45ec6b98a70e4a96f
# Dataset of lyria/ルリア (Granblue Fantasy) This is the dataset of lyria/ルリア (Granblue Fantasy), containing 484 images and their tags. The core tags of this character are `blue_hair, long_hair, very_long_hair, blue_eyes, ahoge`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 484 | 560.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyria_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 484 | 360.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyria_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 968 | 670.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyria_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 484 | 510.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyria_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 968 | 901.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lyria_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/lyria_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, closed_mouth, simple_background, solo, upper_body, white_background, white_dress, blush, collarbone, looking_at_viewer, smile, gem, white_choker, bridal_gauntlets | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, earmuffs, solo, upper_body, blue_scarf, closed_mouth, looking_at_viewer, smile, white_dress, blush, simple_background, white_background | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blue_sky, cloud, day, official_alternate_costume, outdoors, solo, straw_hat, twin_braids, :d, blush, floating_hair, open_mouth, star_hair_ornament, sun_hat, wrist_cuffs, blue_one-piece_swimsuit, looking_at_viewer, ocean, standing, bare_shoulders, blue_dress, closed_eyes, collarbone, hair_flower, see-through, shell_hair_ornament, sleeveless_dress, twitter_username, white_dress, white_flower | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, dress, full_body, hair_ornament, single_thighhigh, solo, looking_at_viewer, open_mouth, detached_sleeves, idol, microphone, :d, blue_footwear, collarbone, one_eye_closed, simple_background, uneven_legwear, white_background | | 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, hetero, 1boy, nipples, simple_background, grey_background, navel, solo_focus, open_mouth, pussy, shiny, spread_legs, completely_nude, penis, small_breasts, uncensored, collarbone, lying, sex, vaginal | | 5 | 15 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, choker, solo, braid, fingerless_gloves, smile, earrings, looking_at_viewer, ponytail, blush, hair_ribbon, heart, skirt, white_shirt, holding_microphone, official_alternate_costume, simple_background, white_background, fishnets, black_gloves, microphone_stand, midriff, open_mouth, short_sleeves, hair_ornament, navel, pantyhose | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | closed_mouth | simple_background | solo | upper_body | white_background | white_dress | blush | collarbone | looking_at_viewer | smile | gem | white_choker | 
bridal_gauntlets | earmuffs | blue_scarf | blue_sky | cloud | day | official_alternate_costume | outdoors | straw_hat | twin_braids | :d | floating_hair | open_mouth | star_hair_ornament | sun_hat | wrist_cuffs | blue_one-piece_swimsuit | ocean | standing | blue_dress | closed_eyes | hair_flower | see-through | shell_hair_ornament | sleeveless_dress | twitter_username | white_flower | dress | full_body | hair_ornament | single_thighhigh | detached_sleeves | idol | microphone | blue_footwear | one_eye_closed | uneven_legwear | hetero | 1boy | nipples | grey_background | navel | solo_focus | pussy | shiny | spread_legs | completely_nude | penis | small_breasts | uncensored | lying | sex | vaginal | choker | braid | fingerless_gloves | earrings | ponytail | hair_ribbon | heart | skirt | white_shirt | holding_microphone | fishnets | black_gloves | microphone_stand | midriff | short_sleeves | pantyhose | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:-------|:-------------|:-------------------|:--------------|:--------|:-------------|:--------------------|:--------|:------|:---------------|:-------------------|:-----------|:-------------|:-----------|:--------|:------|:-----------------------------|:-----------|:------------|:--------------|:-----|:----------------|:-------------|:---------------------|:----------|:--------------|:--------------------------|:--------|:-----------|:-------------|:--------------|:--------------|:--------------|:----------------------|:-------------------|:-------------------|:---------------|:--------|:------------|:----------------|:-------------------|:-------------------|:-------|:-------------|:----------------|:-----------------|:-----------------|:---------|:-------|:----------|:------------------|:--------|:-------------|:--------|:--------|:--------------|:------------------|:--------|:----------------|:-------------|:--------|:------|:----------|:---------|:--------|:--------------------|:-----------|:-----------|:--------------|:--------|:--------|:--------------|:---------------------|:-----------|:---------------|:-------------------|:----------|:----------------|:------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | X | X | X | X | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | | | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | 
![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | | X | | | X | X | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | | | | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 5 | 15 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | X | | X | | X | X | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
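As a small supplementary sketch (not part of the original card): the non-raw packages listed in the "List of Packages" table earlier in this card are plain zip archives, so they can be fetched the same way as the raw package, just with a different filename. The repo id and the `dataset-800.zip` filename below are taken from that table; everything else is illustrative.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package named in the "List of Packages" table.
zip_file = hf_hub_download(
    repo_id='CyberHarem/lyria_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the archive; per the table this bundle is images plus text tag files.
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

print(sorted(os.listdir(dataset_dir))[:10])
```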
CyberHarem/lyria_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:06:04+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:50:01+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of lyria/ルリア (Granblue Fantasy) ======================================= This is the dataset of lyria/ルリア (Granblue Fantasy), containing 484 images and their tags. The core tags of this character are 'blue\_hair, long\_hair, very\_long\_hair, blue\_eyes, ahoge', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
5bf0b421017691ef8185e9ec46cf7ab7296d5e17
# Deities-25

The dataset comprises a comprehensive collection of 8,239 images showcasing diverse forms and iconographies of 25 Indic deities. This dataset is a unique blend of manually curated and web-scraped visuals, providing a valuable resource for the computer vision community interested in exploring the artistic and cultural expressions embedded in the visual representation of deities.

# Supported Tasks

- `image-classification`: The goal of this task is to classify a given image of a deity into one of 25 classes.

## Uses

### Direct Use

- *Cultural Awareness*: Raise awareness about the rich cultural heritage of the Indian subcontinent by incorporating these diverse depictions of Indic deities into educational materials.
- *Research and Preservation*: Contribute to academic research in the fields of art history, cultural studies, and anthropology. The dataset serves as a valuable resource for preserving and studying the visual representations of revered figures.
- *Deep learning research*: Offers exciting opportunities for multi-label classification tasks. However, a challenge in this domain is dealing with inter-class similarity, where images from different categories share common features.

### Source Data

Social media posts, smartphone camera captures, and images generated using diffusion methods.

#### Data Collection and Processing

We carefully selected diverse images for the dataset and used the `cleanvision` library from cleanlab to remove images with issues. A custom Python script helped organize the data effectively. When it came to training our model, we relied on torchvision transforms to prepare our dataset for training.

## Dataset Structure

```python
DatasetDict({
    train: Dataset({
        features: ['image', 'label'],
        num_rows: 6583
    })
    validation: Dataset({
        features: ['image', 'label'],
        num_rows: 1656
    })
})
```

### Dataset Splits

This dataset is split into a train and a validation split. The split sizes are as follows:

| Split name | Num samples |
|------------|-------------|
| train      | 6583        |
| valid      | 1656        |

## Bias, Risks, and Limitations

- *Bias* - The dataset primarily represents Indic deities, potentially introducing a cultural bias. Efforts were made to include diverse forms, but the dataset may not fully encapsulate the breadth of artistic expressions across different Indic cultures.
- *Risks* - Images of deities can be open to various interpretations. The dataset may not capture nuanced meanings, leading to potential misinterpretations by users.
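For illustration only (not part of the original card): a minimal sketch of loading the splits above with the `datasets` library and preparing them with torchvision transforms, as described under "Data Collection and Processing". The repo id `Yegiiii/deities-25` is taken from this card's metadata; the resize and tensor conversion choices are assumptions, not the authors' training recipe.

```python
from datasets import load_dataset
from torchvision import transforms

# Load the train/validation splits described in "Dataset Structure".
ds = load_dataset("Yegiiii/deities-25")

# Example torchvision preprocessing (values are illustrative assumptions).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def to_tensors(batch):
    # 'image' and 'label' are the features listed in the card.
    batch["pixel_values"] = [preprocess(img.convert("RGB")) for img in batch["image"]]
    return batch

train = ds["train"].with_transform(to_tensors)
print(ds)  # DatasetDict with 6583 train / 1656 validation rows
print(train[0]["pixel_values"].shape, train[0]["label"])
```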
Yegiiii/deities-25
[ "task_categories:image-classification", "size_categories:1K<n<10K", "language:en", "license:apache-2.0", "art", "heritage", "culture", "iconography", "region:us" ]
2024-01-21T13:17:08+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "pretty_name": "Deities", "tags": ["art", "heritage", "culture", "iconography"]}
2024-01-21T16:49:34+00:00
[]
[ "en" ]
TAGS #task_categories-image-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #art #heritage #culture #iconography #region-us
Deities-25 ========== The dataset comprises of a comprehensive collection of 8,239 images showcasing diverse forms and iconographies of 25 Indic deities. This dataset is a unique blend of manually curated and web-scraped visuals, providing a valuable resource for the computer vision community interested in exploring the artistic and cultural expressions embedded in the visual representation of deities. Supported Tasks =============== * 'image-classification': The goal of this task is to classify a given image of a deity into one of 25 classes. Uses ---- ### Direct Use * *Cultural Awareness*: Raise awareness about the rich cultural heritage of the Indian subcontinent by incorporating these diverse depictions of Indic deities into educational materials. * *Research and Preservation*: Contribute to academic research in the fields of art history, cultural studies, and anthropology. The dataset serves as a valuable resource for preserving and studying the visual representations of revered figures. * *Deep learning research*: Offers exciting opportunities for multi-label classification tasks. However, a challenge in this domain is dealing with inter-class similarity, where images from different categories share common features. ### Source Data Social media posts, smartphone camera captures, images generated using diffusion methods. #### Data Collection and Processing We carefully selected diverse images for the dataset and used the 'cleanvision' library from cleanlab to remove images with issues. A custom Python script helped organize the data effectively. When it came to training our model, we relied on torchvision transforms to prepare our dataset for training. Dataset Structure ----------------- ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: Bias, Risks, and Limitations ---------------------------- * *Bias* - The dataset primarily represents Indic deities, potentially introducing a cultural bias. Efforts were made to include diverse forms, but the dataset may not fully encapsulate the breadth of artistic expressions across different Indic cultures. * *Risks* - Images of deities can be open to various interpretations. The dataset may not capture nuanced meanings, leading to potential misinterpretations by users.
[ "### Direct Use\n\n\n* *Cultural Awareness*: Raise awareness about the rich cultural heritage of the Indian subcontinent by incorporating these diverse depictions of Indic deities into educational materials.\n* *Research and Preservation*: Contribute to academic research in the fields of art history, cultural studies, and anthropology. The dataset serves as a valuable resource for preserving and studying the visual representations of revered figures.\n* *Deep learning research*: Offers exciting opportunities for multi-label classification tasks. However, a challenge in this domain is dealing with inter-class similarity, where images from different categories share common features.", "### Source Data\n\n\nSocial media posts, smartphone camera captures, images generated using diffusion methods.", "#### Data Collection and Processing\n\n\nWe carefully selected diverse images for the dataset and used the 'cleanvision' library from cleanlab to remove images with issues. A custom Python script helped organize the data effectively. When it came to training our model, we relied on torchvision transforms to prepare our dataset for training.\n\n\nDataset Structure\n-----------------", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:\n\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\n* *Bias* - The dataset primarily represents Indic deities, potentially introducing a cultural bias. Efforts were made to include diverse forms, but the dataset may not fully encapsulate the breadth of artistic expressions across different Indic cultures.\n* *Risks* - Images of deities can be open to various interpretations. The dataset may not capture nuanced meanings, leading to potential misinterpretations by users." ]
[ "TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #art #heritage #culture #iconography #region-us \n", "### Direct Use\n\n\n* *Cultural Awareness*: Raise awareness about the rich cultural heritage of the Indian subcontinent by incorporating these diverse depictions of Indic deities into educational materials.\n* *Research and Preservation*: Contribute to academic research in the fields of art history, cultural studies, and anthropology. The dataset serves as a valuable resource for preserving and studying the visual representations of revered figures.\n* *Deep learning research*: Offers exciting opportunities for multi-label classification tasks. However, a challenge in this domain is dealing with inter-class similarity, where images from different categories share common features.", "### Source Data\n\n\nSocial media posts, smartphone camera captures, images generated using diffusion methods.", "#### Data Collection and Processing\n\n\nWe carefully selected diverse images for the dataset and used the 'cleanvision' library from cleanlab to remove images with issues. A custom Python script helped organize the data effectively. When it came to training our model, we relied on torchvision transforms to prepare our dataset for training.\n\n\nDataset Structure\n-----------------", "### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:\n\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\n* *Bias* - The dataset primarily represents Indic deities, potentially introducing a cultural bias. Efforts were made to include diverse forms, but the dataset may not fully encapsulate the breadth of artistic expressions across different Indic cultures.\n* *Risks* - Images of deities can be open to various interpretations. The dataset may not capture nuanced meanings, leading to potential misinterpretations by users." ]
d5a9ea44d1b2a002cc3c5766f916eacf3b6e93cc
# SimulateBench: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation.

Human behavior simulation of AI agents necessitates that the agents possess a quality of believability, which is crucial as it facilitates users in establishing trust toward the agents and streamlines the fulfillment of the agents' goals. While recent advancements in Large Language Model (LLM) based agents have improved human behavior simulation, challenges inherent to LLMs (e.g., long context modeling) can undermine their believability. Consequently, evaluating AI agent believability becomes imperative. Unfortunately, prior research often neglects the negative impacts of LLM deficiencies. To address these gaps, we introduce two metrics for assessing LLM-based agent believability: consistency and robustness, together with a benchmark, SimulateBench, to evaluate the consistency and robustness of agents implemented with popular LLMs. We find that agents (i) struggle to accurately depict character information when presented with lengthy profile inputs; (ii) exhibit vulnerability to profile perturbations; and (iii) are significantly affected by certain key factors that impact their overall believability.

## Dataset Details

#### Profile Descriptive Framework & Character Dataset

The Profile Descriptive Framework is introduced to document information about a person comprehensively, consisting of three parts: Immutable Characteristic, Social Role, and Relationship. We selected characters from TV dramas of popular genres: The Simpsons (Animated), Friends (Comedy), Breaking Bad (Crime), and The Rings of Power (Science fiction). According to the profile descriptive framework, we extract the profile information from the fandom.

The profile is recorded in JSON format for easy use. You can find the profile of a character in the folder "/profile/". The Social Role and Relationship information are stored in one JSON file.

For example, if you want to load the profile of the character Homer, his profile files are stored in

Immutable Characteristic: `/profile/homer/profile_v1/basic_information.json`

Social Role, Relationship: `/profile/homer/profile_v1/roles.json`

#### Consistency Dataset & Robustness Dataset

The two datasets are proposed to test the consistency and robustness performance of agents when prompted with the profile of a character to simulate that character. The two datasets are composed of single-choice questions and their gold answers. According to the profile descriptive framework, there are three kinds of questions, related to Immutable Characteristics, Social Roles, and Relationships. For a character, you can find the dataset in the folder "/benchmark_only_QA".

For example, if you want to test the agent when simulating the character of Homer, his dataset is stored in:

Immutable Characteristic: `/benchmark_only_QA/basic_informationhomer/homer/questions.json`

Social Role: `/benchmark_only_QA/role_non_relation/homer/questions.json`

Relationship: `/benchmark_only_QA/role_relation/homer/questions.json`

> To test the agent's consistency ability, we will ask the agent to first simulate the character. Then, we will ask the agent to finish the corresponding single-choice questions in the Consistency Dataset. The accuracy score will be used as a measure of the consistency ability.
> The Robustness Dataset consists of those datasets whose names are in the format of 'homer_{variants}'. To test the agent's robustness ability, the agent is tested by comparing its performance on the Consistency dataset and the Robustness dataset.

For example, if we want to test the agent's robustness ability when faced with age perturbations, we will first change the birthday-year field of homer in the profile, namely from 1956 to 1985. We then ask the agent to simulate homer (`/profile/homer/`) and homer_1985 (`/profile/homer_1985/`) by prompting the two profiles to the agent respectively. Then, we will ask the agent to finish the tests in `/benchmark_only_QA/{question_type}/homer/questions.json` and `/benchmark_only_QA/{question_type}/homer_1985/questions.json` respectively. Finally, we can compare the two scores on the two datasets to analyse the agent's robustness ability.

### Dataset Sources

- **Repository:** [SimulateBench](https://github.com/GAIR-NLP/SimulateBench)
- **Paper:** [How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation](https://arxiv.org/abs/2312.17115)

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation

**BibTeX:**

@misc{xiao2023far,
      title={How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation},
      author={Yang Xiao and Yi Cheng and Jinlan Fu and Jiashuo Wang and Wenjie Li and Pengfei Liu},
      year={2023},
      eprint={2312.17115},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
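To make the evaluation protocol above concrete, here is a minimal sketch of scoring one character's consistency questions. It is not code from the SimulateBench repository: the question field names (`question`, `options`, `gold_answer`), the list layout of `questions.json`, and the `ask_agent` callable are assumptions standing in for the benchmark's actual schema and whatever LLM agent is under test; the file paths assume the dataset repo has been downloaded to the working directory.

```python
import json

def load_json(path):
    # Helper for the JSON files described in "Profile Descriptive Framework".
    with open(path, encoding='utf-8') as f:
        return json.load(f)

# Profile pieces for homer, following the folder layout given in the card.
profile = {
    'immutable': load_json('profile/homer/profile_v1/basic_information.json'),
    'roles': load_json('profile/homer/profile_v1/roles.json'),
}

# Consistency questions for homer (field names below are hypothetical).
questions = load_json('benchmark_only_QA/role_relation/homer/questions.json')

def evaluate(ask_agent, profile, questions):
    """ask_agent(profile, question, options) -> chosen option; accuracy serves as the consistency score."""
    correct = 0
    for q in questions:
        answer = ask_agent(profile, q['question'], q['options'])
        correct += int(answer == q['gold_answer'])
    return correct / len(questions)

# Robustness: run evaluate() again with a perturbed profile/question set
# (e.g. homer_1985) and compare the two scores.
```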
YangXiao-nlp/SimulateBench
[ "task_categories:text-generation", "task_categories:question-answering", "language:en", "license:apache-2.0", "arxiv:2312.17115", "region:us" ]
2024-01-21T13:20:07+00:00
{"language": ["en"], "license": "apache-2.0", "task_categories": ["text-generation", "question-answering"]}
2024-01-21T14:16:58+00:00
[ "2312.17115" ]
[ "en" ]
TAGS #task_categories-text-generation #task_categories-question-answering #language-English #license-apache-2.0 #arxiv-2312.17115 #region-us
# SimulateBench: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation. Human behavior simulation of AI agents necessitates that the agents possess a quality of believability, which is crucial as it facilitates users in establishing trust toward the agents and streamlines the fulfillment of the agents' goals. While recent advancements in Large Language Model (LLM) based agents have improved human behavior simulation, challenges inherent to LLMs (e.g., long context modeling) can undermine their believability. Consequently, evaluating AI agent believability becomes imperative. Unfortunately, prior research often neglects the negative impacts of LLM deficiencies. To address these gaps, we introduce two metrics for assessing LLM-based agent believability: consistency and robustness, together with a benchmark, SimulateBench, to evaluate the consistency and robustness of agents implemented with popular LLMs. We find that agents (i) struggle to accurately depict character information when presented with lengthy profile inputs; (ii) exhibit vulnerability to profile perturbations; and (iii) are significantly affected by certain key factors that impact their overall believability. ## Dataset Details #### Profile Descriptive Framework & Character Dataset The Profile Descriptive Framework is introduced to document information about a person comprehensively, consisting of three parts: Immutable Characteristic, Social Role, Relationship. We selected characters from TV dramas of popular genres: The Simpsons (Animated), Friends (Comedy), Breaking Bad (Crime), and The Rings of Power(Science fiction). According to the profile descriptive framework, we extract the profile information from the fandom. The profile is recorded in JSON format for easy use. You can find the profile of a character in the folder of "/profile/". The Social Role, Relationship information are stored in one JSON file. For example, if you want to load the profile of character of homer, his profile file is stored in Immutable Chaacteristic: '/profile/homer/profile_v1/basic_information.json' Social Role, Relationship: '/profile/homer/profile_v1/URL' #### Consistency Dataset & Robustness Dataset The two dataset is proposed to test the Consistency and robustness performance of agents when prompted with the profile of a character to simulate the character. The two datasets are composed of single-choice questions and their gold answer. According to the profile descriptive framework, there are three kinds of questions related to Immutable Characteristics, Social Roles, and Relationships. For a character, you can find the dataset in the folder of "/benchmark_only_QA". For example, if you want to test the agent when simulating the character of Homer, his dataset is stored in: Immutable Characteristic: '/benchmark_only_QA/basic_informationhomer/homer/URL' Social Role: '/benchmark_only_QA/role_non_relation/homer/URL' Relationship: '/benchmark_only_QA/role_relation/homer/URL' > To test the agent's consistency ability, we will ask the agent to first simulate the character. Then, we will ask the agent to finsh the corresponding single-choice question in the Consistency Dataset. The accuracy score will be used as a measure of the consistency ability. > The Robustness Dataset is these datasets whose names are in the format of 'homer_{varients}'. To test the agent's robustness ability, the agent is tested by comparing their performance on the Consistency dataset and Robustness dataset. 
For example, if we want to test the agent's robustness ability when faced with age perturbations, we will first change the field of the birthday year of the homer in the profile, namely from 1956 to 1985. We then ask the agent to simulate homer('/profile/homer/'') and homer_1985('/profile/homer_1985/'') by prompting the two profile to the agent respectively. Then, we will ask the agent to finish the test in the '/benchmark_only_QA/{question_type}/homer/URL' and '/benchmark_only_QA/{question_type}/homer_1985/URL' respectively. Then, we can compare the two score on the two dataset to analyse the agent's robustness ability. ### Dataset Sources - Repository: SimulateBench - Paper: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. @misc{xiao2023far, title={How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation}, author={Yang Xiao and Yi Cheng and Jinlan Fu and Jiashuo Wang and Wenjie Li and Pengfei Liu}, year={2023}, eprint={2312.17115}, archivePrefix={arXiv}, primaryClass={cs.CL} }
[ "# SimulateBench: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation.\n\n\n\nHuman behavior simulation of AI agents necessitates that the agents possess a quality of believability, which is crucial as it facilitates users in establishing trust toward the agents and streamlines the fulfillment of the agents' goals. While recent advancements in Large Language Model (LLM) based agents have improved human behavior simulation, challenges inherent to LLMs (e.g., long context modeling) can undermine their believability. Consequently, evaluating AI agent believability becomes imperative. Unfortunately, prior research often neglects the negative impacts of LLM deficiencies. To address these gaps, we introduce two metrics for assessing LLM-based agent believability: consistency and robustness, together with a benchmark, SimulateBench, to evaluate the consistency and robustness of agents implemented with popular LLMs. We find that agents (i) struggle to accurately depict character information when presented with lengthy profile inputs; (ii) exhibit vulnerability to profile perturbations; and (iii) are significantly affected by certain key factors that impact their overall believability.", "## Dataset Details", "#### Profile Descriptive Framework & Character Dataset\n\nThe Profile Descriptive Framework is introduced to document information about a person comprehensively, consisting of three parts: Immutable Characteristic, Social Role, Relationship. We selected characters from TV dramas of popular genres: The Simpsons (Animated), Friends (Comedy), Breaking Bad (Crime), and The Rings of Power(Science fiction). According to the profile descriptive framework, we extract the profile information from the fandom.\n\nThe profile is recorded in JSON format for easy use. You can find the profile of a character in the folder of \"/profile/\". The Social Role, Relationship information are stored in one JSON file.\n\nFor example, if you want to load the profile of character of homer, his profile file is stored in\n\nImmutable Chaacteristic: '/profile/homer/profile_v1/basic_information.json'\n\nSocial Role, Relationship: '/profile/homer/profile_v1/URL'", "#### Consistency Dataset & Robustness Dataset\n\nThe two dataset is proposed to test the Consistency and robustness performance of agents when prompted with the profile of a character to simulate the character. The two datasets are composed of single-choice questions and their gold answer. According to the profile descriptive framework, there are three kinds of questions related to Immutable Characteristics, Social Roles, and Relationships. For a character, you can find the dataset in the folder of \"/benchmark_only_QA\".\n\nFor example, if you want to test the agent when simulating the character of Homer, his dataset is stored in:\n\nImmutable Characteristic: '/benchmark_only_QA/basic_informationhomer/homer/URL'\n\nSocial Role: '/benchmark_only_QA/role_non_relation/homer/URL'\n\nRelationship: '/benchmark_only_QA/role_relation/homer/URL'\n\n> To test the agent's consistency ability, we will ask the agent to first simulate the character. Then, we will ask the agent to finsh the corresponding single-choice question in the Consistency Dataset. The accuracy score will be used as a measure of the consistency ability.\n> The Robustness Dataset is these datasets whose names are in the format of 'homer_{varients}'. 
To test the agent's robustness ability, the agent is tested by comparing their performance on the Consistency dataset and Robustness dataset. For example, if we want to test the agent's robustness ability when faced with age perturbations, we will first change the field of the birthday year of the homer in the profile, namely from 1956 to 1985. We then ask the agent to simulate homer('/profile/homer/'') and homer_1985('/profile/homer_1985/'') by prompting the two profile to the agent respectively. Then, we will ask the agent to finish the test in the '/benchmark_only_QA/{question_type}/homer/URL' and '/benchmark_only_QA/{question_type}/homer_1985/URL' respectively. Then, we can compare the two score on the two dataset to analyse the agent's robustness ability.", "### Dataset Sources\n\n\n\n- Repository: SimulateBench\n- Paper: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n@misc{xiao2023far,\n title={How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation}, \n author={Yang Xiao and Yi Cheng and Jinlan Fu and Jiashuo Wang and Wenjie Li and Pengfei Liu},\n year={2023},\n eprint={2312.17115},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}" ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #language-English #license-apache-2.0 #arxiv-2312.17115 #region-us \n", "# SimulateBench: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation.\n\n\n\nHuman behavior simulation of AI agents necessitates that the agents possess a quality of believability, which is crucial as it facilitates users in establishing trust toward the agents and streamlines the fulfillment of the agents' goals. While recent advancements in Large Language Model (LLM) based agents have improved human behavior simulation, challenges inherent to LLMs (e.g., long context modeling) can undermine their believability. Consequently, evaluating AI agent believability becomes imperative. Unfortunately, prior research often neglects the negative impacts of LLM deficiencies. To address these gaps, we introduce two metrics for assessing LLM-based agent believability: consistency and robustness, together with a benchmark, SimulateBench, to evaluate the consistency and robustness of agents implemented with popular LLMs. We find that agents (i) struggle to accurately depict character information when presented with lengthy profile inputs; (ii) exhibit vulnerability to profile perturbations; and (iii) are significantly affected by certain key factors that impact their overall believability.", "## Dataset Details", "#### Profile Descriptive Framework & Character Dataset\n\nThe Profile Descriptive Framework is introduced to document information about a person comprehensively, consisting of three parts: Immutable Characteristic, Social Role, Relationship. We selected characters from TV dramas of popular genres: The Simpsons (Animated), Friends (Comedy), Breaking Bad (Crime), and The Rings of Power(Science fiction). According to the profile descriptive framework, we extract the profile information from the fandom.\n\nThe profile is recorded in JSON format for easy use. You can find the profile of a character in the folder of \"/profile/\". The Social Role, Relationship information are stored in one JSON file.\n\nFor example, if you want to load the profile of character of homer, his profile file is stored in\n\nImmutable Chaacteristic: '/profile/homer/profile_v1/basic_information.json'\n\nSocial Role, Relationship: '/profile/homer/profile_v1/URL'", "#### Consistency Dataset & Robustness Dataset\n\nThe two dataset is proposed to test the Consistency and robustness performance of agents when prompted with the profile of a character to simulate the character. The two datasets are composed of single-choice questions and their gold answer. According to the profile descriptive framework, there are three kinds of questions related to Immutable Characteristics, Social Roles, and Relationships. For a character, you can find the dataset in the folder of \"/benchmark_only_QA\".\n\nFor example, if you want to test the agent when simulating the character of Homer, his dataset is stored in:\n\nImmutable Characteristic: '/benchmark_only_QA/basic_informationhomer/homer/URL'\n\nSocial Role: '/benchmark_only_QA/role_non_relation/homer/URL'\n\nRelationship: '/benchmark_only_QA/role_relation/homer/URL'\n\n> To test the agent's consistency ability, we will ask the agent to first simulate the character. Then, we will ask the agent to finsh the corresponding single-choice question in the Consistency Dataset. 
The accuracy score will be used as a measure of the consistency ability.\n> The Robustness Dataset is these datasets whose names are in the format of 'homer_{varients}'. To test the agent's robustness ability, the agent is tested by comparing their performance on the Consistency dataset and Robustness dataset. For example, if we want to test the agent's robustness ability when faced with age perturbations, we will first change the field of the birthday year of the homer in the profile, namely from 1956 to 1985. We then ask the agent to simulate homer('/profile/homer/'') and homer_1985('/profile/homer_1985/'') by prompting the two profile to the agent respectively. Then, we will ask the agent to finish the test in the '/benchmark_only_QA/{question_type}/homer/URL' and '/benchmark_only_QA/{question_type}/homer_1985/URL' respectively. Then, we can compare the two score on the two dataset to analyse the agent's robustness ability.", "### Dataset Sources\n\n\n\n- Repository: SimulateBench\n- Paper: How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n@misc{xiao2023far,\n title={How Far Are We from Believable AI Agents? A Framework for Evaluating the Believability of Human Behavior Simulation}, \n author={Yang Xiao and Yi Cheng and Jinlan Fu and Jiashuo Wang and Wenjie Li and Pengfei Liu},\n year={2023},\n eprint={2312.17115},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}" ]
da9291e1e19e9bcf0d7bce7d7421fe235d58a8db
# Dataset of ferry/フェリ (Granblue Fantasy) This is the dataset of ferry/フェリ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `animal_ears, blue_hair, long_hair, breasts, wavy_hair, earrings, yellow_eyes, rabbit_ears, bangs, medium_breasts, single_earring, hair_between_eyes, brown_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 769.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 429.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1242 | 924.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 675.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1242 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/ferry_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, erune, looking_at_viewer, sideboob, simple_background, solo, armpits, black_gloves, elbow_gloves, arms_up, bare_shoulders, jewelry, upper_body, white_background, x_hair_ornament, black_dress, covered_navel, backless_outfit, closed_mouth | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_dress, black_gloves, erune, looking_at_viewer, sideboob, solo, x_hair_ornament, elbow_gloves, holding_whip, blush, cape, jewelry, armpits, bare_shoulders, black_thighhighs, closed_mouth, simple_background, smile, very_long_hair, arm_up, sleeveless_dress, weapon, white_background | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, backless_outfit, bare_shoulders, black_gloves, blue_skirt, dress, erune, looking_at_viewer, open_mouth, sideboob, solo, frills, holding_whip, jewelry, sleeveless, brown_thighhighs, ghost, armpits, brown_gloves | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, black_gloves, boots, brown_thighhighs, erune, holding, looking_at_viewer, sideboob, solo, black_footwear, blue_skirt, dress, open_mouth, simple_background, whip, white_background, backless_outfit, full_body, jewelry, black_thighhighs | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_back, bare_shoulders, erune, looking_at_viewer, looking_back, sideboob, solo, black_gloves, from_behind, backless_dress, blue_skirt, jewelry, thighhighs, weapon, arm_up, blush, ghost, holding_whip, open_mouth, small_breasts | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, ass, black_gloves, blue_skirt, erune, from_behind, looking_at_viewer, looking_back, sideboob, solo, bare_back, bare_shoulders, black_thighhighs, blush, jewelry, simple_background, backless_dress, black_panties, small_breasts, white_background, armpits, holding_whip, thighs | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, black_gloves, detached_sleeves, erune, looking_at_viewer, red_dress, solo, underboob, official_alternate_costume, 
simple_background, white_background, hair_bow, small_breasts, smile, blush, fur-trimmed_sleeves, hand_up, jewelry, long_sleeves, open_mouth, very_long_hair, wide_sleeves | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, erune, holding, open_mouth, solo, bare_shoulders, black_skirt, blush, ghost, hair_flower, high-waist_skirt, looking_at_viewer, sleeveless_shirt, white_shirt, small_breasts, :d, ;d, bare_arms, book, full_body, jewelry, one_eye_closed, petals, simple_background, standing, white_background | | 8 | 28 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, erune, hair_flower, solo, looking_at_viewer, official_alternate_costume, jewelry, navel, blush, ponytail, smile, blue_skirt, cleavage, bare_shoulders, white_bikini, bikini_skirt, hair_ribbon, open_mouth, blue_flower, white_background, simple_background | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, blue_sky, blush, cloud, day, erune, jewelry, looking_at_viewer, solo, armpits, blue_bikini, navel, outdoors, small_breasts, smile, very_long_hair, cleavage, ocean, open_mouth, water, arms_behind_head, arms_up, bare_shoulders, one_eye_closed, stomach, thighs, wading | | 10 | 11 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, 1girl, blush, erune, hetero, penis, solo_focus, nipples, open_mouth, sex, vaginal, nude, small_breasts, sweat, jewelry, cum_in_pussy, navel, spread_legs, thighhighs, bar_censor, black_gloves, uncensored | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | erune | looking_at_viewer | sideboob | simple_background | solo | armpits | black_gloves | elbow_gloves | arms_up | bare_shoulders | jewelry | upper_body | white_background | x_hair_ornament | black_dress | covered_navel | backless_outfit | closed_mouth | holding_whip | cape | black_thighhighs | smile | very_long_hair | arm_up | sleeveless_dress | weapon | blue_skirt | dress | open_mouth | frills | sleeveless | brown_thighhighs | ghost | brown_gloves | boots | holding | black_footwear | whip | full_body | bare_back | looking_back | from_behind | backless_dress | thighhighs | small_breasts | ass | black_panties | thighs | detached_sleeves | red_dress | underboob | official_alternate_costume | hair_bow | fur-trimmed_sleeves | hand_up | long_sleeves | wide_sleeves | black_skirt | hair_flower | high-waist_skirt | sleeveless_shirt | white_shirt | :d | ;d | bare_arms | book | one_eye_closed | petals | standing | navel | ponytail | cleavage | white_bikini | bikini_skirt | hair_ribbon | blue_flower | blue_sky | cloud | day | blue_bikini | outdoors | ocean | water | arms_behind_head | stomach | wading | 1boy | hetero | penis | solo_focus | nipples | sex | vaginal | nude | sweat | cum_in_pussy | spread_legs | bar_censor | uncensored | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------|:--------------------|:-----------|:--------------------|:-------|:----------|:---------------|:---------------|:----------|:-----------------|:----------|:-------------|:-------------------|:------------------|:--------------|:----------------|:------------------|:---------------|:---------------|:-------|:-------------------|:--------|:-----------------|:---------|:-------------------|:---------|:-------------|:--------|:-------------|:---------|:-------------|:-------------------|:--------|:---------------|:--------|:----------|:-----------------|:-------|:------------|:------------|:---------------|:--------------|:-----------------|:-------------|:----------------|:------|:----------------|:---------|:-------------------|:------------|:------------|:-----------------------------|:-----------|:----------------------|:----------|:---------------|:---------------|:--------------|:--------------|:-------------------|:-------------------|:--------------|:-----|:-----|:------------|:-------|:-----------------|:---------|:-----------|:--------|:-----------|:-----------|:---------------|:---------------|:--------------|:--------------|:-----------|:--------|:------|:--------------|:-----------|:--------|:--------|:-------------------|:----------|:---------|:-------|:---------|:--------|:-------------|:----------|:------|:----------|:-------|:--------|:---------------|:--------------|:-------------|:-------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | | X | X | X | | | X | X | | | | | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | X | X | | X | | | X | X | | X | | | | X | | | | X | | | | | | X | X | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | 
X | | X | | X | | | X | X | | | | | | | | X | | | | | X | | X | X | | X | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | X | X | X | X | X | | | X | X | | X | | | | | | X | | X | | | | | | X | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | | X | X | | X | | | X | X | | X | | | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | | X | X | | | | | X | X | | X | | | | | | | | | | | | | | | | X | | | | X | | | X | | | X | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 28 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | X | | X | X | | | | | X | X | | X | | | | | | | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | | | X | X | | | X | X | X | | | | | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 10 | 11 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | X | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/ferry_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:28:37+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:21:50+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of ferry/フェリ (Granblue Fantasy) ======================================= This is the dataset of ferry/フェリ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'animal\_ears, blue\_hair, long\_hair, breasts, wavy\_hair, earrings, yellow\_eyes, rabbit\_ears, bangs, medium\_breasts, single\_earring, hair\_between\_eyes, brown\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
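The ferry card above ends with "just run the following code", but the loading snippet was stripped from this plain-text rendering. The vira, anchira and danua cards later in this file carry that snippet verbatim; presumably the ferry version is identical apart from the repository id, which is taken from this record:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive of the ferry dataset
zip_file = hf_hub_download(
    repo_id='CyberHarem/ferry_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the extracted dataset with waifuc and inspect image tags
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```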
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
0990ca011a0cf1cc4f879e7014fd7e240162752d
# Dataset of vira/ヴィーラ (Granblue Fantasy) This is the dataset of vira/ヴィーラ (Granblue Fantasy), containing 27 images and their tags. The core tags of this character are `blonde_hair, long_hair, red_eyes, bow, hair_bow, ponytail, breasts, bangs, hair_between_eyes, hair_ornament, black_bow`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 27 | 27.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 27 | 20.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 53 | 37.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 27 | 26.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 53 | 45.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/vira_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, hair_flower, looking_at_viewer, obi, open_mouth, solo, blush, floral_print, red_kimono, :d, sidelocks, wide_sleeves, hamaya, holding, long_sleeves, official_alternate_costume, upper_body | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, armor, looking_at_viewer, smile, sword, cleavage, dress, holding_weapon, upper_body | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_flower | looking_at_viewer | obi | open_mouth | solo | blush | floral_print | red_kimono | :d | sidelocks | wide_sleeves | hamaya | holding | long_sleeves | official_alternate_costume | upper_body | armor | smile | sword | cleavage | dress | holding_weapon | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:------|:-------------|:-------|:--------|:---------------|:-------------|:-----|:------------|:---------------|:---------|:----------|:---------------|:-----------------------------|:-------------|:--------|:--------|:--------|:-----------|:--------|:-----------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X |
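The package table in the vira card above also lists prebuilt IMG+TXT archives such as dataset-800.zip alongside the raw package. Below is a minimal sketch of fetching and unpacking one of them with the same hf_hub_download pattern used in the waifuc snippet; that each image comes paired with a same-named .txt tag file is implied by the IMG+TXT type but not spelled out here, so treat that part as an assumption.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/vira_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack it next to the raw dataset
out_dir = 'vira_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# list the extracted files; IMG+TXT packages are assumed to pair each
# image with a same-named .txt file holding its tags
for name in sorted(os.listdir(out_dir)):
    print(name)
```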
CyberHarem/vira_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:28:43+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:35:08+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of vira/ヴィーラ (Granblue Fantasy) ======================================= This is the dataset of vira/ヴィーラ (Granblue Fantasy), containing 27 images and their tags. The core tags of this character are 'blonde\_hair, long\_hair, red\_eyes, bow, hair\_bow, ponytail, breasts, bangs, hair\_between\_eyes, hair\_ornament, black\_bow', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
c5320a3321f773d2ee9205546d4105e2a59211e1
# Dataset of anchira/アンチラ (Granblue Fantasy) This is the dataset of anchira/アンチラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are `blonde_hair, animal_ears, short_hair, monkey_ears, monkey_tail, tail, breasts, small_breasts, red_eyes, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 698.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 394.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1244 | 869.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 620.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1244 | 1.21 GiB | [Download](https://huggingface.co/datasets/CyberHarem/anchira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/anchira_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, erune, hair_flower, looking_at_viewer, official_alternate_costume, solo, blush, double_bun, smile, covered_navel, highleg_swimsuit, thighs, ahoge, casual_one-piece_swimsuit, detached_sleeves, see-through, blue_one-piece_swimsuit, open_mouth | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, barefoot, detached_sleeves, erune, looking_at_viewer, solo, feet, bare_shoulders, soles, ass, blush, sideboob, toes, staff, cloud, smile | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cloud, detached_sleeves, erune, looking_at_viewer, sideboob, smile, solo, staff, bare_shoulders, brown_eyes, hagoromo | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, detached_sleeves, erune, looking_at_viewer, solo, staff, barefoot, hairband, sideboob, blush, cloud, white_background | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, blush, detached_sleeves, erune, solo, thighhighs, looking_at_viewer, simple_background, leotard, white_background, staff, wide_sleeves, brown_eyes, sideboob, thighs | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, cloud, detached_sleeves, erune, looking_at_viewer, solo, staff, two_side_up, barefoot, blush, cleavage_cutout, holding, sash, sideboob, hagoromo, hairband, simple_background, white_background, wide_sleeves, ahoge, chibi, leotard, sitting, smile | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, ahoge, blue_shorts, blush, cleavage_cutout, erune, fur_trim, hairband, long_sleeves, looking_at_viewer, short_shorts, smile, solo, two_side_up, blue_ribbon, closed_mouth, petals, striped_thighhighs, denim_shorts, midriff, navel, sitting, white_shirt, bell, belt, full_body, hair_between_eyes, jacket, staff, tail_ribbon | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | detached_sleeves, harvin, 
wide_sleeves, bare_shoulders, looking_at_viewer, 1girl, bandeau, blush, feathers, hair_beads, very_long_hair, :o, black_thighhighs, pelvic_curtain, earrings, hands_on_own_face, midriff, multiple_girls, pointy_ears, solo_focus | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | erune | hair_flower | looking_at_viewer | official_alternate_costume | solo | blush | double_bun | smile | covered_navel | highleg_swimsuit | thighs | ahoge | casual_one-piece_swimsuit | detached_sleeves | see-through | blue_one-piece_swimsuit | open_mouth | barefoot | feet | soles | ass | sideboob | toes | staff | cloud | brown_eyes | hagoromo | hairband | white_background | thighhighs | simple_background | leotard | wide_sleeves | two_side_up | cleavage_cutout | holding | sash | chibi | sitting | blue_shorts | fur_trim | long_sleeves | short_shorts | blue_ribbon | closed_mouth | petals | striped_thighhighs | denim_shorts | midriff | navel | white_shirt | bell | belt | full_body | hair_between_eyes | jacket | tail_ribbon | harvin | bandeau | feathers | hair_beads | very_long_hair | :o | black_thighhighs | pelvic_curtain | earrings | hands_on_own_face | multiple_girls | pointy_ears | solo_focus | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------|:--------------------|:-----------------------------|:-------|:--------|:-------------|:--------|:----------------|:-------------------|:---------|:--------|:----------------------------|:-------------------|:--------------|:--------------------------|:-------------|:-----------|:-------|:--------|:------|:-----------|:-------|:--------|:--------|:-------------|:-----------|:-----------|:-------------------|:-------------|:--------------------|:----------|:---------------|:--------------|:------------------|:----------|:-------|:--------|:----------|:--------------|:-----------|:---------------|:---------------|:--------------|:---------------|:---------|:---------------------|:---------------|:----------|:--------|:--------------|:-------|:-------|:------------|:--------------------|:---------|:--------------|:---------|:----------|:-----------|:-------------|:-----------------|:-----|:-------------------|:-----------------|:-----------|:--------------------|:-----------------|:--------------|:-------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | X | X | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | | X | | | X | | | | | | X | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | X | X | | | | | | | | X | | | | X | | | | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | X | | X | X | | | | | X | | | X | | | | | | | | X | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | X | | X | X | | X | | | | X | | X | | | | X | | | | X | | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | | X | X | | X | | | | X | | | | | | | | | | | | X | | | | X | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/anchira_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:28:53+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:15:21+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of anchira/アンチラ (Granblue Fantasy) ========================================== This is the dataset of anchira/アンチラ (Granblue Fantasy), containing 500 images and their tags. The core tags of this character are 'blonde\_hair, animal\_ears, short\_hair, monkey\_ears, monkey\_tail, tail, breasts, small\_breasts, red\_eyes, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
d17f588d1887891623ed8f8e5545cb0cce28624f
# Dataset of danua/ダヌア (Granblue Fantasy) This is the dataset of danua/ダヌア (Granblue Fantasy), containing 259 images and their tags. The core tags of this character are `horns, long_hair, breasts, pointy_ears, red_eyes, black_hair, large_breasts, antenna_hair, horn_ornament, hair_between_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 259 | 310.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 259 | 196.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 640 | 424.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 259 | 284.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 640 | 554.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/danua_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, draph, solo, looking_at_viewer, nipples, nude, blush, navel, huge_breasts, bandaged_arm, pussy, simple_background, white_background | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, draph, fingerless_gloves, looking_at_viewer, necklace, solo, bandaged_arm, crescent, simple_background, white_background, blush, nipples, white_dress, blood, cleavage | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage, draph, looking_at_viewer, official_alternate_costume, solo, white_bikini, necklace, bandaged_arm, navel, blush | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, cleavage, draph, looking_at_viewer, necklace, official_alternate_costume, solo, bandaged_arm, doll, navel, purple_hair, white_bikini, blush, front-tie_top, innertube, crescent, water | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bandaged_arm, cleavage, draph, navel, necklace, official_alternate_costume, purple_hair, side-tie_bikini_bottom, solo, white_bikini, looking_at_viewer, simple_background, white_background, crescent, finger_to_mouth, front-tie_bikini_top | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, draph, hetero, nipples, nude, sex, blush, penis, solo_focus, vaginal, bandaged_arm, censored, girl_on_top, cowgirl_position, cum_in_pussy, navel, open_mouth, simple_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | solo | looking_at_viewer | nipples | nude | blush | navel | huge_breasts | bandaged_arm | pussy | simple_background | white_background | black_gloves | fingerless_gloves | necklace | crescent | white_dress | blood | cleavage | official_alternate_costume | white_bikini | doll | purple_hair | front-tie_top | innertube | water | side-tie_bikini_bottom | finger_to_mouth | front-tie_bikini_top | 1boy | hetero | sex | penis | solo_focus | vaginal | censored | girl_on_top | cowgirl_position | cum_in_pussy | open_mouth | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:----------|:-------|:--------|:--------|:---------------|:---------------|:--------|:--------------------|:-------------------|:---------------|:--------------------|:-----------|:-----------|:--------------|:--------|:-----------|:-----------------------------|:---------------|:-------|:--------------|:----------------|:------------|:--------|:-------------------------|:------------------|:-----------------------|:-------|:---------|:------|:--------|:-------------|:----------|:-----------|:--------------|:-------------------|:---------------|:-------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 2 | 12 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | | X | X | | X | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | | X | X | | X | | | | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | | | | X | | X | | X | X | | | X | X | | | X | X | X | | X | | | | X | X | X | | | | | | | | | | | | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
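The waifuc snippet in the danua card above prints item.meta['tags'] for every image; building on it, here is a hedged sketch of filtering the extracted raw dataset down to images carrying a specific tag. The tag name 'official_alternate_costume' is taken from the cluster table above, and the exact type of meta['tags'] (a list versus a tag-to-score mapping) is not shown in this card, so the set() normalization below is an assumption.

```python
from waifuc.source import LocalSource

# iterate the extracted raw dataset (see the loading snippet above)
source = LocalSource('dataset_dir')

wanted = 'official_alternate_costume'  # example tag from the cluster table
kept = []
for item in source:
    tags = item.meta['tags']
    # set() yields the tag names whether tags is a list or a tag->score mapping
    tag_names = set(tags)
    if wanted in tag_names:
        kept.append(item.meta['filename'])

print(f'{len(kept)} images tagged {wanted}:')
for name in kept:
    print(' -', name)
```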
CyberHarem/danua_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T13:29:09+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T14:15:03+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of danua/ダヌア (Granblue Fantasy) ======================================= This is the dataset of danua/ダヌア (Granblue Fantasy), containing 259 images and their tags. The core tags of this character are 'horns, long\_hair, breasts, pointy\_ears, red\_eyes, black\_hair, large\_breasts, antenna\_hair, horn\_ornament, hair\_between\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
9e1b05e3371597c3f7e9ce0a0d8356962a3ac7ad
# Dataset Card for Evaluation run of macadeliccc/piccolo-8x7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-8x7b](https://huggingface.co/macadeliccc/piccolo-8x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-8x7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T13:50:45.103516](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-8x7b/blob/main/results_2024-01-21T13-50-45.103516.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6465546091601527, "acc_stderr": 0.03216541518825926, "acc_norm": 0.6461126157257634, "acc_norm_stderr": 0.03282893283116576, "mc1": 0.48225214198286415, "mc1_stderr": 0.01749247084307536, "mc2": 0.6416518638228494, "mc2_stderr": 0.015589064706165267 }, "harness|arc:challenge|25": { "acc": 0.6672354948805461, "acc_stderr": 0.013769863046192302, "acc_norm": 0.6962457337883959, "acc_norm_stderr": 0.013438909184778762 }, "harness|hellaswag|10": { "acc": 0.6990639314877515, "acc_stderr": 0.00457727584443245, "acc_norm": 0.8698466440948018, "acc_norm_stderr": 0.003357844249123955 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700918, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.0368122963339432, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.0368122963339432 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.049598599663841815, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.049598599663841815 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.032400380867927465, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.032400380867927465 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055266, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055266 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.024121125416941187, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.024121125416941187 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461777, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461777 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8529411764705882, "acc_stderr": 0.02485747808025046, "acc_norm": 0.8529411764705882, "acc_norm_stderr": 0.02485747808025046 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601446, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822914, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822914 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993452, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993452 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39888268156424583, "acc_stderr": 0.016376966142610073, "acc_norm": 0.39888268156424583, "acc_norm_stderr": 0.016376966142610073 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818767, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818767 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7160493827160493, "acc_stderr": 0.025089478523765137, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 0.025089478523765137 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.01274085387294983, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.01274085387294983 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.02873932851398357, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.02873932851398357 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507205, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507205 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.48225214198286415, "mc1_stderr": 0.01749247084307536, "mc2": 0.6416518638228494, "mc2_stderr": 0.015589064706165267 }, "harness|winogrande|5": { "acc": 0.7987371744277821, "acc_stderr": 0.01126851997157768 }, "harness|gsm8k|5": { "acc": 0.7202426080363912, "acc_stderr": 0.012364384016735319 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
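The evaluation card above links the raw results file for this run and shows its per-task dictionary. Below is a hedged sketch of pulling that file directly and averaging the 5-shot MMLU (hendrycksTest) accuracies; the filename and the 'acc' keys are taken from the card, while the exact top-level layout of the JSON (whether the per-task entries sit under a 'results' key) is an assumption handled defensively in the code.

```python
import json

from huggingface_hub import hf_hub_download

# fetch the results file linked above from the details repository
path = hf_hub_download(
    repo_id='open-llm-leaderboard/details_macadeliccc__piccolo-8x7b',
    repo_type='dataset',
    filename='results_2024-01-21T13-50-45.103516.json',
)
with open(path) as f:
    data = json.load(f)

# the per-task dict shown above may sit at the top level or under a
# "results" key depending on the file layout; handle both cases
scores = data.get('results', data)

# average the 5-shot accuracies of the MMLU (hendrycksTest) subtasks
mmlu = [v['acc'] for k, v in scores.items() if k.startswith('harness|hendrycksTest')]
print(f'MMLU subtasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}')
```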
open-llm-leaderboard/details_macadeliccc__piccolo-8x7b
[ "region:us" ]
2024-01-21T13:53:07+00:00
{"pretty_name": "Evaluation run of macadeliccc/piccolo-8x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-8x7b](https://huggingface.co/macadeliccc/piccolo-8x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__piccolo-8x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T13:50:45.103516](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-8x7b/blob/main/results_2024-01-21T13-50-45.103516.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6465546091601527,\n \"acc_stderr\": 0.03216541518825926,\n \"acc_norm\": 0.6461126157257634,\n \"acc_norm_stderr\": 0.03282893283116576,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6416518638228494,\n \"mc2_stderr\": 0.015589064706165267\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192302,\n \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778762\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6990639314877515,\n \"acc_stderr\": 0.00457727584443245,\n \"acc_norm\": 0.8698466440948018,\n \"acc_norm_stderr\": 0.003357844249123955\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461777,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461777\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6416518638228494,\n \"mc2_stderr\": 0.015589064706165267\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7202426080363912,\n \"acc_stderr\": 
0.012364384016735319\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/piccolo-8x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|arc:challenge|25_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|gsm8k|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hellaswag|10_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T13_50_45.103516", "path": ["**/details_harness|winogrande|5_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T13-50-45.103516.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T13_50_45.103516", "path": ["results_2024-01-21T13-50-45.103516.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T13-50-45.103516.parquet"]}]}]}
2024-01-21T13:53:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of macadeliccc/piccolo-8x7b Dataset automatically created during the evaluation run of model macadeliccc/piccolo-8x7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading snippet reproduced below): ## Latest results These are the latest results from run 2024-01-21T13:50:45.103516 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
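The loading snippet referenced above ("To load the details from a run…") does not appear in this plain-text rendering of the card. For reference, the call below is reproduced from the card's own configuration metadata (the dataset id, the config name `harness_winogrande_5`, and the `train` split are all taken from that metadata; nothing else is added):

```python
from datasets import load_dataset

# Per-sample details for one evaluation task of the piccolo-8x7b run;
# the "latest" split always points to the newest evaluation.
data = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__piccolo-8x7b",
    "harness_winogrande_5",
    split="train",
)
```

Any other task can be loaded the same way by swapping in one of the `harness_*` config names listed in this record's metadata.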
[ "# Dataset Card for Evaluation run of macadeliccc/piccolo-8x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/piccolo-8x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T13:50:45.103516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of macadeliccc/piccolo-8x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/piccolo-8x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T13:50:45.103516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1c02912bcc8dfb56a2023127ca5d159ce330830b
# Dataset Card for Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered ## Creation process - Load data from https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized/viewer/default/train_sft - Do zero-shot classification with facebook/bart-large-mnli in this kind of way (the actual implementation might be slightly different): ```python preds = pipe(f'{row["instruction"]} is a question about:', candidate_labels=["USA related question", "Math related question", "General question", "Coding related question"]) ``` - Filter out rows with too high scores in the following categories: ["USA related question", "Math related question", "Coding related question"] - Write rows to a .txt file with *** on a newline separating instruction/accepted_response/rejected_response and then END on a newline separating samples (a sketch of this round-trip is given below) - Upload the file to deepl.com for file translation --> parse samples back from translated files --> maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity
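Below is a minimal, hypothetical sketch of the ***/END text round-trip described in the creation process above. It is not the authors' actual script: the column names (`instruction`, `response_accepted`, `response_rejected`) come from this dataset's feature list, while the helper names and the exact file handling are assumptions made for illustration.

```python
# Hypothetical sketch of the ***/END round-trip used for DeepL document translation;
# helper names and file handling are illustrative, not the original implementation.

def write_samples(rows, path):
    # One sample = instruction / accepted response / rejected response,
    # each separated by "***" on its own line; samples are separated by "END".
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            f.write(row["instruction"] + "\n***\n")
            f.write(row["response_accepted"] + "\n***\n")
            f.write(row["response_rejected"] + "\nEND\n")

def parse_samples(path):
    # Parse a (translated) file back into instruction/accepted/rejected triples.
    with open(path, encoding="utf-8") as f:
        raw = f.read()
    samples = []
    for block in raw.split("\nEND\n"):
        parts = [p.strip() for p in block.split("\n***\n")]
        if len(parts) == 3:
            samples.append({
                "instruction": parts[0],
                "response_accepted": parts[1],
                "response_rejected": parts[2],
            })
    return samples
```

In the card's pipeline the written file would be uploaded to deepl.com for document translation, and the translated file parsed back with the same separators before the fasttext / kenlm filtering steps.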
Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered
[ "task_categories:text-generation", "language:fi", "license:mit", "region:us" ]
2024-01-21T14:01:48+00:00
{"language": ["fi"], "license": "mit", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response_accepted", "dtype": "string"}, {"name": "response_rejected", "dtype": "string"}, {"name": "instruction_perplexity_kenlm", "dtype": "int64"}, {"name": "chosen_response_perplexity_kenlm", "dtype": "int64"}, {"name": "rejected_response_perplexity_kenlm", "dtype": "int64"}, {"name": "combined_perplexity_dpo", "dtype": "int64"}, {"name": "combined_perplexity_sft", "dtype": "int64"}, {"name": "instruction_lang", "dtype": "string"}, {"name": "instruction_lang_proba", "dtype": "float64"}, {"name": "chosen_response_lang", "dtype": "string"}, {"name": "chosen_response_lang_proba", "dtype": "float64"}, {"name": "rejected_response_lang", "dtype": "string"}, {"name": "rejected_response_lang_proba", "dtype": "float64"}, {"name": "perplexity_instruction_len_ratio", "dtype": "float64"}, {"name": "perplexity_response_len_ratio", "dtype": "float64"}, {"name": "dataset_source", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 72293514, "num_examples": 12421}], "download_size": 41218224, "dataset_size": 72293514}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
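Not part of the original metadata: a small usage sketch showing how the feature schema listed above can be inspected after loading. The dataset id and the `train` split come from this record; everything else is illustrative.

```python
from datasets import load_dataset

# Load the filtered Finnish SFT/DPO data and inspect the schema described in the metadata above.
ds = load_dataset("Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered", split="train")

print(ds.features)           # instruction, response_accepted, response_rejected, perplexity columns, ...
print(ds[0]["instruction"])  # one translated instruction
```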
2024-02-13T21:33:03+00:00
[]
[ "fi" ]
TAGS #task_categories-text-generation #language-Finnish #license-mit #region-us
# Dataset Card for Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered ## Creation process - Load data from URL - Do zero shot classification with facebook/bart-large-mnli in this kind of way (Actual implementation might be slightly different): - Filter out rows with too high scores in following categories ["USA related question", "Math related question","Coding related question"] - Write rows to .txt file with * on a newline separating instruction/accepted_response/rejected_response and then END on a newline separating samples - Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity
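The classification snippet referenced above ("in this kind of way…") is stripped from this rendering of the card. The sketch below reproduces the call from the full card text and adds a plausible pipeline construction; the `pipeline(...)` setup and the sample row are assumptions, not part of the original.

```python
from transformers import pipeline

# Zero-shot topic classifier; the model name comes from the card, the pipeline setup is assumed.
pipe = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

row = {"instruction": "How do I sort a list of numbers?"}  # illustrative example row

preds = pipe(
    f'{row["instruction"]} is a question about:',
    candidate_labels=[
        "USA related question",
        "Math related question",
        "General question",
        "Coding related question",
    ],
)
print(preds["labels"][0], preds["scores"][0])  # highest-scoring label comes first
```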
[ "# Dataset Card for Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered", "## Creation process\n - Load data from URL\n - Do zero shot classification with facebook/bart-large-mnli in this kind of way (Actual implementation might be slightly different):\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/accepted_response/rejected_response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity" ]
[ "TAGS\n#task_categories-text-generation #language-Finnish #license-mit #region-us \n", "# Dataset Card for Finnish-NLP/ultrafeedback_deepl_sft_dpo_filtered", "## Creation process\n - Load data from URL\n - Do zero shot classification with facebook/bart-large-mnli in this kind of way (Actual implementation might be slightly different):\n\n- Filter out rows with too high scores in following categories [\"USA related question\", \"Math related question\",\"Coding related question\"]\n- Write rows to .txt file with * on a newline separating instruction/accepted_response/rejected_response and then END on a newline separating samples\n- Upload file to URL for file translation --> parse samples back from translated files --> Maybe some additional cleaning/filtering based on fasttext langdetect / kenlm perplexity" ]
8c8cf2668eddf3131da5fa4288dbff6493f7037f
# Dataset Card for Evaluation run of Steelskull/Aurora_base_test <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Steelskull/Aurora_base_test](https://huggingface.co/Steelskull/Aurora_base_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Steelskull__Aurora_base_test", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-21T14:10:28.609924](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aurora_base_test/blob/main/results_2024-01-21T14-10-28.609924.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.601064189469788, "acc_stderr": 0.033290989395204654, "acc_norm": 0.606869148379456, "acc_norm_stderr": 0.03397539939220347, "mc1": 0.5312117503059975, "mc1_stderr": 0.01746936487457753, "mc2": 0.6783801405582048, "mc2_stderr": 0.015256900573808395 }, "harness|arc:challenge|25": { "acc": 0.5870307167235495, "acc_stderr": 0.014388344935398326, "acc_norm": 0.628839590443686, "acc_norm_stderr": 0.014117971901142825 }, "harness|hellaswag|10": { "acc": 0.6520613423620792, "acc_stderr": 0.00475342980664544, "acc_norm": 0.8398725353515236, "acc_norm_stderr": 0.0036597474762410597 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.660377358490566, "acc_stderr": 0.02914690474779833, "acc_norm": 0.660377358490566, "acc_norm_stderr": 0.02914690474779833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.46078431372549017, "acc_stderr": 0.049598599663841815, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.049598599663841815 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.04615186962583703, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.04615186962583703 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155254, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155254 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6419354838709678, "acc_stderr": 0.027273890594300645, "acc_norm": 0.6419354838709678, "acc_norm_stderr": 0.027273890594300645 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.03510766597959217, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.03510766597959217 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124484, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124484 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397443, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5717948717948718, "acc_stderr": 0.025088301454694834, "acc_norm": 0.5717948717948718, "acc_norm_stderr": 0.025088301454694834 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968351, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968351 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 
0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.01738141556360868, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.01738141556360868 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854053, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854053 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.02977177522814563, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.02977177522814563 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.600896860986547, "acc_stderr": 0.03286745312567961, "acc_norm": 0.600896860986547, "acc_norm_stderr": 0.03286745312567961 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.03880848301082393, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.03880848301082393 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7739463601532567, "acc_stderr": 0.014957458504335839, "acc_norm": 0.7739463601532567, "acc_norm_stderr": 0.014957458504335839 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6965317919075145, "acc_stderr": 0.024752411960917205, "acc_norm": 0.6965317919075145, "acc_norm_stderr": 0.024752411960917205 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3240223463687151, "acc_stderr": 0.015652542496421125, "acc_norm": 0.3240223463687151, "acc_norm_stderr": 0.015652542496421125 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.673202614379085, "acc_stderr": 0.02685729466328141, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.02685729466328141 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818777, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818777 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6820987654320988, "acc_stderr": 0.02591006352824088, "acc_norm": 0.6820987654320988, "acc_norm_stderr": 0.02591006352824088 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.4645390070921986, "acc_stderr": 0.029752389657427047, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.029752389657427047 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4211212516297262, "acc_stderr": 0.012610325733489906, "acc_norm": 0.4211212516297262, "acc_norm_stderr": 0.012610325733489906 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.029349803139765873, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.029349803139765873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6127450980392157, "acc_stderr": 0.019706875804085637, "acc_norm": 0.6127450980392157, "acc_norm_stderr": 0.019706875804085637 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6938775510204082, "acc_stderr": 0.02950489645459596, "acc_norm": 0.6938775510204082, "acc_norm_stderr": 0.02950489645459596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.03115715086935557, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.03115715086935557 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5312117503059975, "mc1_stderr": 0.01746936487457753, "mc2": 0.6783801405582048, "mc2_stderr": 0.015256900573808395 }, "harness|winogrande|5": { "acc": 0.7640094711917916, "acc_stderr": 0.011933828850275625 }, "harness|gsm8k|5": { "acc": 0.3252463987869598, "acc_stderr": 0.012903904752543927 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Steelskull__Aurora_base_test
[ "region:us" ]
2024-01-21T14:12:46+00:00
{"pretty_name": "Evaluation run of Steelskull/Aurora_base_test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Steelskull/Aurora_base_test](https://huggingface.co/Steelskull/Aurora_base_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Aurora_base_test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-21T14:10:28.609924](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aurora_base_test/blob/main/results_2024-01-21T14-10-28.609924.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.601064189469788,\n \"acc_stderr\": 0.033290989395204654,\n \"acc_norm\": 0.606869148379456,\n \"acc_norm_stderr\": 0.03397539939220347,\n \"mc1\": 0.5312117503059975,\n \"mc1_stderr\": 0.01746936487457753,\n \"mc2\": 0.6783801405582048,\n \"mc2_stderr\": 0.015256900573808395\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142825\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n \"acc_stderr\": 0.00475342980664544,\n \"acc_norm\": 0.8398725353515236,\n \"acc_norm_stderr\": 0.0036597474762410597\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n 
\"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968351,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968351\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335839,\n \"acc_norm\": 
0.7739463601532567,\n \"acc_norm_stderr\": 0.014957458504335839\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n \"acc_stderr\": 0.015652542496421125,\n \"acc_norm\": 0.3240223463687151,\n \"acc_norm_stderr\": 0.015652542496421125\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.02685729466328141,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.02685729466328141\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489906,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489906\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5312117503059975,\n \"mc1_stderr\": 0.01746936487457753,\n \"mc2\": 0.6783801405582048,\n \"mc2_stderr\": 0.015256900573808395\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3252463987869598,\n \"acc_stderr\": 0.012903904752543927\n }\n}\n```", "repo_url": 
"https://huggingface.co/Steelskull/Aurora_base_test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|arc:challenge|25_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|gsm8k|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hellaswag|10_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_21T14_10_28.609924", "path": ["**/details_harness|winogrande|5_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-21T14-10-28.609924.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_21T14_10_28.609924", "path": ["results_2024-01-21T14-10-28.609924.parquet"]}, {"split": "latest", "path": ["results_2024-01-21T14-10-28.609924.parquet"]}]}]}
2024-01-21T14:13:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Steelskull/Aurora_base_test Dataset automatically created during the evaluation run of model Steelskull/Aurora_base_test on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-21T14:10:28.609924 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
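A minimal sketch of the loading step mentioned in the card above ("To load the details from a run, you can for instance do the following"), assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming for this model; the `harness_winogrande_5` config name is taken from the metadata listed earlier in this record:

```python
from datasets import load_dataset

# Repo id is inferred from the usual details_<org>__<model> pattern (assumption).
data = load_dataset(
    "open-llm-leaderboard/details_Steelskull__Aurora_base_test",
    "harness_winogrande_5",
    split="train",  # the "train" split always points to the latest results
)
print(data[0])
```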
[ "# Dataset Card for Evaluation run of Steelskull/Aurora_base_test\n\n\n\nDataset automatically created during the evaluation run of model Steelskull/Aurora_base_test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T14:10:28.609924(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Steelskull/Aurora_base_test\n\n\n\nDataset automatically created during the evaluation run of model Steelskull/Aurora_base_test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-21T14:10:28.609924(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
41a01d74a93a6f67a2686ffc856796328d3e3803
## Novelai3 Images The Novelai3 text-to-image distillation dataset contains over 30GB of anime-related (text, image) pairs, intended solely for educational and research purposes! It must not be used for any illicit activities. ### Production Method The dataset was created through automated browser operations, repeatedly clicking the "generate image" button and saving the resulting images. Over the course of a month, approximately 38GB of (image, text instruction) pairs were collected. It has not been finely filtered by humans, but there are plans to select and refine it when time and energy permit (a reduced version of the dataset, novelai3-filtered, will be released separately). ### Use & Citation Feel free to train an open-source version of the open-novelai3 image generation model on top of existing anime models, but please remember to cite our dataset work link when using it for training (this data collection effort was quite laborious in terms of time and manpower 💦💦). We hope to contribute to the better development of open-source artificial intelligence! ### Some Training Suggestions 0. It is not recommended to train with the entire dataset all at once. 1. It is suggested to adjust the proportions and repetition frequencies of different subcategories within the dataset according to the style of the model you wish to learn (this can be done by directly adding or deleting data). 2. You can check if the current prompt is what you want, and if not, write Python scripts to perform batch replacements (you can even use models like GPT4-V, Qwen-VL, BLIP2, Deepbooru, etc., to replace the current tags as needed). 3. For the categories you're particularly interested in, you can manually review the specific image content and actively delete some of the poorly generated samples before training (this is akin to a human preference selection, which will improve the final quality of the model). ## Download Method Aistudio https://aistudio.baidu.com/datasetdetail/257868 huggingface https://huggingface.co/datasets/shareAI/novelai3 Tip: If you find some categories you're interested in are not currently included, feel free to make suggestions and we will consider expanding the dataset. ## Novelai3 Images novelai3的文本生成图片蒸馏数据集 ,包含30余G二次元动漫方面的(文本,图像)对,仅作为学习和研究使用!不得用于违规用途。 ### 制作方式 通过自动化浏览器操作,不断点击生图按钮和保存生成的图像,最后收集得到的一个数据集,时间为期一个月,大概获得了38个G的(图像,文本指令)对。 未经过人为精细筛选,计划之后有时间精力再进行筛选(会单独再放出一个novelai3-filtered的缩小版数据集) ### 使用 & 引用 欢迎拿去在已有的动漫模型基础上训练开源版本的open-novelai3图像生成模型哈,但请使用训练时不要忘了引用我们的数据集工作链接(本次数据收集工作在时间和人力上都是很辛苦的💦💦) 希望能够为人工智能开源更好的发展贡献一份绵薄之力~ ### 一点训练建议 0、不建议直接使用全部数据一股脑进行训练。 1、建议根据自己想要学出的模型风格,按需调整数据集中各个子类别的占比、重复次数等(可以通过直接增删数据) 2、可以查看当前的prompt是否是你想要的,如果不是可以编写python脚本规则进行批量替换(甚至可以调用GPT4-V、Qwen-VL、BLIP2、Deepbooru等模型按需替换掉当前标签) 3、针对你重点关注的那些类别,可以手动查看具体图像内容,主动删一些生成效果不好的学习样本再训练 (相当于人类偏好选择,会提升模型最终的质量表现) ## 下载方式 aistudio https://aistudio.baidu.com/datasetdetail/257868 huggingface https://huggingface.co/datasets/shareAI/novelai3 tip: 如果你发现有哪些想要的类别当前没有,欢迎提出建议我们会再进行数据扩充。
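A minimal sketch of training suggestion 2 above (batch-replacing tags in the text side of the pairs with a Python script). The directory layout, the tag substitutions, and the idea that each caption sits in a same-named .txt file next to its image are assumptions for illustration, not details confirmed by the card:

```python
import pathlib

# Assumed layout: captions in .txt files alongside images under dataset_dir.
dataset_dir = pathlib.Path("novelai3")

# Hypothetical tag edits; adjust to the tags you actually want to rewrite.
replacements = {
    "old_tag": "new_tag",
    "unwanted_tag, ": "",  # drop a tag entirely
}

for caption_file in dataset_dir.rglob("*.txt"):
    text = caption_file.read_text(encoding="utf-8")
    for old, new in replacements.items():
        text = text.replace(old, new)
    caption_file.write_text(text, encoding="utf-8")
```

The same loop is a natural place to plug in a captioning model (GPT4-V, Qwen-VL, BLIP2, Deepbooru, etc.) if you want to regenerate rather than just substitute tags.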
shareAI/novelai3
[ "task_categories:text-to-image", "size_categories:n<1K", "language:en", "license:apache-2.0", "image generation", "novelai", "nai", "text2image", "prompt", "images", "stable-diffusion", "stable-diffusion-xl", "art", "region:us" ]
2024-01-21T14:13:54+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "pretty_name": "novelai3 images", "tags": ["image generation", "novelai", "nai", "text2image", "prompt", "images", "stable-diffusion", "stable-diffusion-xl", "art"], "viewer": true}
2024-02-04T11:59:09+00:00
[]
[ "en" ]
TAGS #task_categories-text-to-image #size_categories-n<1K #language-English #license-apache-2.0 #image generation #novelai #nai #text2image #prompt #images #stable-diffusion #stable-diffusion-xl #art #region-us
## Novelai3 Images The Novelai3 text-to-image distillation dataset contains over 30GB of anime-related (text, image) pairs, intended solely for educational and research purposes! It must not be used for any illicit activities. ### Production Method The dataset was created through automated browser operations, repeatedly clicking the "generate image" button and saving the resulting images. Over the course of a month, approximately 38GB of (image, text instruction) pairs were collected. It has not been finely filtered by humans, but there are plans to select and refine it when time and energy permit (a reduced version of the dataset, novelai3-filtered, will be released separately). ### Use & Citation Feel free to train an open-source version of the open-novelai3 image generation model on top of existing anime models, but please remember to cite our dataset work link when using it for training (this data collection effort was quite laborious in terms of time and manpower ). We hope to contribute to the better development of open-source artificial intelligence! ### Some Training Suggestions 0. It is not recommended to train with the entire dataset all at once. 1. It is suggested to adjust the proportions and repetition frequencies of different subcategories within the dataset according to the style of the model you wish to learn (this can be done by directly adding or deleting data). 2. You can check if the current prompt is what you want, and if not, write Python scripts to perform batch replacements (you can even use models like GPT4-V, Qwen-VL, BLIP2, Deepbooru, etc., to replace the current tags as needed). 3. For the categories you're particularly interested in, you can manually review the specific image content and actively delete some of the poorly generated samples before training (this is akin to a human preference selection, which will improve the final quality of the model). ## Download Method Aistudio URL huggingface URL Tip: If you find some categories you're interested in are not currently included, feel free to make suggestions and we will consider expanding the dataset. ## Novelai3 Images novelai3的文本生成图片蒸馏数据集 ,包含30余G二次元动漫方面的(文本,图像)对,仅作为学习和研究使用!不得用于违规用途。 ### 制作方式 通过自动化浏览器操作,不断点击生图按钮和保存生成的图像,最后收集得到的一个数据集,时间为期一个月,大概获得了38个G的(图像,文本指令)对。 未经过人为精细筛选,计划之后有时间精力再进行筛选(会单独再放出一个novelai3-filtered的缩小版数据集) ### 使用 & 引用 欢迎拿去在已有的动漫模型基础上训练开源版本的open-novelai3图像生成模型哈,但请使用训练时不要忘了引用我们的数据集工作链接(本次数据收集工作在时间和人力上都是很辛苦的) 希望能够为人工智能开源更好的发展贡献一份绵薄之力~ ### 一点训练建议 0、不建议直接使用全部数据一股脑进行训练。 1、建议根据自己想要学出的模型风格,按需调整数据集中各个子类别的占比、重复次数等(可以通过直接增删数据) 2、可以查看当前的prompt是否是你想要的,如果不是可以编写python脚本规则进行批量替换(甚至可以调用GPT4-V、Qwen-VL、BLIP2、Deepbooru等模型按需替换掉当前标签) 3、针对你重点关注的那些类别,可以手动查看具体图像内容,主动删一些生成效果不好的学习样本再训练 (相当于人类偏好选择,会提升模型最终的质量表现) ## 下载方式 aistudio URL huggingface URL tip: 如果你发现有哪些想要的类别当前没有,欢迎提出建议我们会再进行数据扩充。
[ "## Novelai3 Images\nThe Novelai3 text-to-image distillation dataset contains over 30GB of anime-related (text, image) pairs, intended solely for educational and research purposes! It must not be used for any illicit activities.", "### Production Method\nThe dataset was created through automated browser operations, repeatedly clicking the \"generate image\" button and saving the resulting images. Over the course of a month, approximately 38GB of (image, text instruction) pairs were collected. \nIt has not been finely filtered by humans, but there are plans to select and refine it when time and energy permit (a reduced version of the dataset, novelai3-filtered, will be released separately).", "### Use & Citation\nFeel free to train an open-source version of the open-novelai3 image generation model on top of existing anime models, but please remember to cite our dataset work link when using it for training (this data collection effort was quite laborious in terms of time and manpower ).\nWe hope to contribute to the better development of open-source artificial intelligence!", "### Some Training Suggestions\n0. It is not recommended to train with the entire dataset all at once.\n1. It is suggested to adjust the proportions and repetition frequencies of different subcategories within the dataset according to the style of the model you wish to learn (this can be done by directly adding or deleting data).\n2. You can check if the current prompt is what you want, and if not, write Python scripts to perform batch replacements (you can even use models like GPT4-V, Qwen-VL, BLIP2, Deepbooru, etc., to replace the current tags as needed).\n3. For the categories you're particularly interested in, you can manually review the specific image content and actively delete some of the poorly generated samples before training (this is akin to a human preference selection, which will improve the final quality of the model).", "## Download Method\n\nAistudio \nURL \n\nhuggingface \nURL \n\n\nTip: If you find some categories you're interested in are not currently included, feel free to make suggestions and we will consider expanding the dataset.", "## Novelai3 Images\nnovelai3的文本生成图片蒸馏数据集 ,包含30余G二次元动漫方面的(文本,图像)对,仅作为学习和研究使用!不得用于违规用途。", "### 制作方式\n通过自动化浏览器操作,不断点击生图按钮和保存生成的图像,最后收集得到的一个数据集,时间为期一个月,大概获得了38个G的(图像,文本指令)对。 \n未经过人为精细筛选,计划之后有时间精力再进行筛选(会单独再放出一个novelai3-filtered的缩小版数据集)", "### 使用 & 引用\n欢迎拿去在已有的动漫模型基础上训练开源版本的open-novelai3图像生成模型哈,但请使用训练时不要忘了引用我们的数据集工作链接(本次数据收集工作在时间和人力上都是很辛苦的)\n希望能够为人工智能开源更好的发展贡献一份绵薄之力~", "### 一点训练建议\n0、不建议直接使用全部数据一股脑进行训练。 \n1、建议根据自己想要学出的模型风格,按需调整数据集中各个子类别的占比、重复次数等(可以通过直接增删数据) \n2、可以查看当前的prompt是否是你想要的,如果不是可以编写python脚本规则进行批量替换(甚至可以调用GPT4-V、Qwen-VL、BLIP2、Deepbooru等模型按需替换掉当前标签) \n3、针对你重点关注的那些类别,可以手动查看具体图像内容,主动删一些生成效果不好的学习样本再训练 (相当于人类偏好选择,会提升模型最终的质量表现)", "## 下载方式\n\naistudio \nURL \n\nhuggingface \nURL \n\ntip: 如果你发现有哪些想要的类别当前没有,欢迎提出建议我们会再进行数据扩充。" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #language-English #license-apache-2.0 #image generation #novelai #nai #text2image #prompt #images #stable-diffusion #stable-diffusion-xl #art #region-us \n", "## Novelai3 Images\nThe Novelai3 text-to-image distillation dataset contains over 30GB of anime-related (text, image) pairs, intended solely for educational and research purposes! It must not be used for any illicit activities.", "### Production Method\nThe dataset was created through automated browser operations, repeatedly clicking the \"generate image\" button and saving the resulting images. Over the course of a month, approximately 38GB of (image, text instruction) pairs were collected. \nIt has not been finely filtered by humans, but there are plans to select and refine it when time and energy permit (a reduced version of the dataset, novelai3-filtered, will be released separately).", "### Use & Citation\nFeel free to train an open-source version of the open-novelai3 image generation model on top of existing anime models, but please remember to cite our dataset work link when using it for training (this data collection effort was quite laborious in terms of time and manpower ).\nWe hope to contribute to the better development of open-source artificial intelligence!", "### Some Training Suggestions\n0. It is not recommended to train with the entire dataset all at once.\n1. It is suggested to adjust the proportions and repetition frequencies of different subcategories within the dataset according to the style of the model you wish to learn (this can be done by directly adding or deleting data).\n2. You can check if the current prompt is what you want, and if not, write Python scripts to perform batch replacements (you can even use models like GPT4-V, Qwen-VL, BLIP2, Deepbooru, etc., to replace the current tags as needed).\n3. For the categories you're particularly interested in, you can manually review the specific image content and actively delete some of the poorly generated samples before training (this is akin to a human preference selection, which will improve the final quality of the model).", "## Download Method\n\nAistudio \nURL \n\nhuggingface \nURL \n\n\nTip: If you find some categories you're interested in are not currently included, feel free to make suggestions and we will consider expanding the dataset.", "## Novelai3 Images\nnovelai3的文本生成图片蒸馏数据集 ,包含30余G二次元动漫方面的(文本,图像)对,仅作为学习和研究使用!不得用于违规用途。", "### 制作方式\n通过自动化浏览器操作,不断点击生图按钮和保存生成的图像,最后收集得到的一个数据集,时间为期一个月,大概获得了38个G的(图像,文本指令)对。 \n未经过人为精细筛选,计划之后有时间精力再进行筛选(会单独再放出一个novelai3-filtered的缩小版数据集)", "### 使用 & 引用\n欢迎拿去在已有的动漫模型基础上训练开源版本的open-novelai3图像生成模型哈,但请使用训练时不要忘了引用我们的数据集工作链接(本次数据收集工作在时间和人力上都是很辛苦的)\n希望能够为人工智能开源更好的发展贡献一份绵薄之力~", "### 一点训练建议\n0、不建议直接使用全部数据一股脑进行训练。 \n1、建议根据自己想要学出的模型风格,按需调整数据集中各个子类别的占比、重复次数等(可以通过直接增删数据) \n2、可以查看当前的prompt是否是你想要的,如果不是可以编写python脚本规则进行批量替换(甚至可以调用GPT4-V、Qwen-VL、BLIP2、Deepbooru等模型按需替换掉当前标签) \n3、针对你重点关注的那些类别,可以手动查看具体图像内容,主动删一些生成效果不好的学习样本再训练 (相当于人类偏好选择,会提升模型最终的质量表现)", "## 下载方式\n\naistudio \nURL \n\nhuggingface \nURL \n\ntip: 如果你发现有哪些想要的类别当前没有,欢迎提出建议我们会再进行数据扩充。" ]
7645214f5adccf425e825452034ce5bec0a6a16a
# Dataset of beatrix/べあとりくす (Granblue Fantasy) This is the dataset of beatrix/べあとりくす (Granblue Fantasy), containing 386 images and their tags. The core tags of this character are `brown_hair, long_hair, breasts, ponytail, large_breasts, bangs, brown_eyes, hair_ornament`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 386 | 473.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 386 | 302.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 875 | 602.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 386 | 431.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 875 | 809.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/beatrix_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/beatrix_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, smile, solo, looking_at_viewer, navel, open_mouth, blush, bare_shoulders, blue_bikini, sunglasses, eyewear_on_head, collarbone, thigh_strap, day, outdoors, sky, wrist_cuffs, ;d, green_eyes, one_eye_closed, asymmetrical_bangs | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, cleavage, looking_at_viewer, navel, solo, blush, simple_background, smile, blue_bikini, detached_collar, cowboy_shot, halterneck, open_mouth, swept_bangs, white_background | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, looking_at_viewer, solo, cleavage, simple_background, tears, navel, white_background, open_mouth, torn_clothes, upper_body | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blue_ribbon, hair_ribbon, looking_at_viewer, solo, upper_body, simple_background, white_background, asymmetrical_bangs, cleavage, armor, blush, medium_breasts, smile | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, armored_boots, cleavage, full_body, gauntlets, holding_sword, looking_at_viewer, midriff, navel, short_shorts, smile, solo, standing, thighhighs, belt, closed_mouth, simple_background, white_background, black_shorts, shoulder_armor, asymmetrical_bangs, floating_hair, gloves, green_eyes, medium_breasts, stomach, thigh_gap | | 5 | 29 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, solo, witch_hat, cleavage, looking_at_viewer, black_gloves, blush, detached_sleeves, bare_shoulders, navel, smile, halloween_costume, open_mouth, midriff, striped_thighhighs | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, blush, hetero, nipples, open_mouth, penis, sex, vaginal, 1boy, navel, solo_focus, spread_legs, collarbone, mosaic_censoring, white_background, arms_behind_back, asymmetrical_bangs, belt, boots, cum_in_mouth, cum_in_pussy, cum_on_breasts, facial, heavy_breathing, simple_background, tears, very_long_hair | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | smile | 
solo | looking_at_viewer | navel | open_mouth | blush | bare_shoulders | blue_bikini | sunglasses | eyewear_on_head | collarbone | thigh_strap | day | outdoors | sky | wrist_cuffs | ;d | green_eyes | one_eye_closed | asymmetrical_bangs | simple_background | detached_collar | cowboy_shot | halterneck | swept_bangs | white_background | tears | torn_clothes | upper_body | blue_ribbon | hair_ribbon | armor | medium_breasts | armored_boots | full_body | gauntlets | holding_sword | midriff | short_shorts | standing | thighhighs | belt | closed_mouth | black_shorts | shoulder_armor | floating_hair | gloves | stomach | thigh_gap | witch_hat | black_gloves | detached_sleeves | halloween_costume | striped_thighhighs | hetero | nipples | penis | sex | vaginal | 1boy | solo_focus | spread_legs | mosaic_censoring | arms_behind_back | boots | cum_in_mouth | cum_in_pussy | cum_on_breasts | facial | heavy_breathing | very_long_hair | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:-------|:--------------------|:--------|:-------------|:--------|:-----------------|:--------------|:-------------|:------------------|:-------------|:--------------|:------|:-----------|:------|:--------------|:-----|:-------------|:-----------------|:---------------------|:--------------------|:------------------|:--------------|:-------------|:--------------|:-------------------|:--------|:---------------|:-------------|:--------------|:--------------|:--------|:-----------------|:----------------|:------------|:------------|:----------------|:----------|:---------------|:-----------|:-------------|:-------|:---------------|:---------------|:-----------------|:----------------|:---------|:----------|:------------|:------------|:---------------|:-------------------|:--------------------|:---------------------|:---------|:----------|:--------|:------|:----------|:-------|:-------------|:--------------|:-------------------|:-------------------|:--------|:---------------|:---------------|:-----------------|:---------|:------------------|:-----------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | | | X | | | | | | | | | | | | | | X | X | | | | 
| X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | X | | | | | | | | | | | | | | X | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 5 | 29 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | | X | X | X | | | | | X | | | | | | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
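Besides the raw archive, the package table above lists IMG+TXT bundles such as `dataset-800.zip`; a minimal sketch of fetching and walking one of them, assuming each image in the archive is paired with a same-named `.txt` tag file (the internal archive layout is not spelled out in the card):

```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/beatrix_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# each .txt file is assumed to hold the tags for the image sharing its stem
for txt_file in Path(dataset_dir).rglob('*.txt'):
    tags = txt_file.read_text(encoding='utf-8').strip()
    print(txt_file.with_suffix(''), tags)
```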
CyberHarem/beatrix_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T14:25:07+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:44:28+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of beatrix/べあとりくす (Granblue Fantasy) ============================================ This is the dataset of beatrix/べあとりくす (Granblue Fantasy), containing 386 images and their tags. The core tags of this character are 'brown\_hair, long\_hair, breasts, ponytail, large\_breasts, bangs, brown\_eyes, hair\_ornament', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
103915ed11c188881c8411a4f40d4f3b0afb3dbd
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
psugam/nepali_grammar_check
[ "region:us" ]
2024-01-21T14:45:20+00:00
{}
2024-01-21T15:27:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
bbe9c6bcea5a8c930071c2f1aa6e25367dc2131d
# Dataset of galleon/ガレヲン (Granblue Fantasy) This is the dataset of galleon/ガレヲン (Granblue Fantasy), containing 325 images and their tags. The core tags of this character are `brown_hair, long_hair, animal_ears, horns, breasts, pointy_ears, extra_ears, bangs, multicolored_hair, large_breasts, streaked_hair, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 325 | 557.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 325 | 293.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 815 | 666.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 325 | 483.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 815 | 988.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galleon_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/galleon_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_dress, closed_eyes, detached_sleeves, frilled_sleeves, solo, white_gloves, bare_shoulders, blush | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, closed_eyes, detached_sleeves, frilled_sleeves, solo, white_gloves, asymmetrical_hair, upper_body | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, asymmetrical_hair, asymmetrical_legwear, closed_eyes, detached_sleeves, frilled_sleeves, solo, thigh_strap, white_gloves, pelvic_curtain, black_dress, full_body, hair_between_eyes, single_thighhigh | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, closed_eyes, hetero, solo_focus, blush, detached_sleeves, white_gloves, nipples, paizuri, huge_breasts, mosaic_censoring, kissing_penis, nude | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_dress, blindfold, cleavage, solo, blue_hair, smile, thigh_strap, long_sleeves, mask, nail_polish, closed_mouth, facing_viewer, parted_lips | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, cleavage, closed_eyes, navel, solo, bikini, hair_between_eyes, thighs, blush, blue_hair, collarbone, thigh_strap, wet | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, closed_eyes, solo, cleavage, collared_shirt, long_sleeves, white_shirt, blue_hair, blush, smile, collarbone, hair_between_eyes, naked_shirt, navel, sitting | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, closed_eyes, completely_nude, hair_between_eyes, solo, smile, collarbone, nipples, artist_name, barefoot, blue_hair, blush, closed_mouth, lips, navel | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | closed_eyes | detached_sleeves | frilled_sleeves | solo | white_gloves | bare_shoulders | blush | asymmetrical_hair | upper_body | asymmetrical_legwear | thigh_strap | pelvic_curtain | full_body | hair_between_eyes | single_thighhigh | 1boy | hetero | solo_focus | nipples | paizuri | huge_breasts | mosaic_censoring | 
kissing_penis | nude | blindfold | cleavage | blue_hair | smile | long_sleeves | mask | nail_polish | closed_mouth | facing_viewer | parted_lips | navel | bikini | thighs | collarbone | wet | collared_shirt | white_shirt | naked_shirt | sitting | completely_nude | artist_name | barefoot | lips | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:-------------------|:------------------|:-------|:---------------|:-----------------|:--------|:--------------------|:-------------|:-----------------------|:--------------|:-----------------|:------------|:--------------------|:-------------------|:-------|:---------|:-------------|:----------|:----------|:---------------|:-------------------|:----------------|:-------|:------------|:-----------|:------------|:--------|:---------------|:-------|:--------------|:---------------|:----------------|:--------------|:--------|:---------|:---------|:-------------|:------|:-----------------|:--------------|:--------------|:----------|:------------------|:--------------|:-----------|:-------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | X | | X | X | | | | X | | | X | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | | | | | | | | | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | X | | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | | | | | | X | | | X | | X | X | X | X | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | | X | | | X | 
| | | | | | X | | | | | X | | | | | | | | X | X | | | | X | | | X | | | X | | | | | | X | X | X | X |
CyberHarem/galleon_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T14:47:32+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:58:35+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of galleon/ガレヲン (Granblue Fantasy) ========================================== This is the dataset of galleon/ガレヲン (Granblue Fantasy), containing 325 images and their tags. The core tags of this character are 'brown\_hair, long\_hair, animal\_ears, horns, breasts, pointy\_ears, extra\_ears, bangs, multicolored\_hair, large\_breasts, streaked\_hair, very\_long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
aaeae7e201397b1671ba07681ea7cda2d62c8b63
# Dataset of sorn/ソーン (Granblue Fantasy) This is the dataset of sorn/ソーン (Granblue Fantasy), containing 248 images and their tags. The core tags of this character are `long_hair, breasts, brown_hair, hair_ornament, head_wings, bangs, large_breasts, brown_eyes, hairband`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 248 | 369.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sorn_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 248 | 208.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sorn_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 599 | 443.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sorn_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 248 | 327.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sorn_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 599 | 631.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sorn_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sorn_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cape, midriff, navel, solo, thighhighs, bow_(weapon), smile, arrow_(projectile), looking_at_viewer, open_mouth, shorts, high_heels, white_gloves, orange_hair | | 1 | 35 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, bare_shoulders, cleavage, looking_at_viewer, official_alternate_costume, white_bikini, choker, collarbone, hair_flower, blush, navel, sidelocks, simple_background, white_background, smile, bracelet, thighs, bridal_garter, open_mouth, french_braid | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, blue_sky, blush, cleavage, collarbone, day, looking_at_viewer, navel, official_alternate_costume, smile, solo, white_bikini, choker, outdoors, sidelocks, beach, braid, ocean, orange_hair, green_eyes, thighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | midriff | navel | solo | thighhighs | bow_(weapon) | smile | arrow_(projectile) | looking_at_viewer | open_mouth | shorts | high_heels | white_gloves | orange_hair | bare_shoulders | cleavage | official_alternate_costume | white_bikini | choker | collarbone | hair_flower | blush | sidelocks | simple_background | white_background | bracelet | thighs | bridal_garter | french_braid | blue_sky | day | outdoors | beach | braid | ocean | green_eyes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------|:-------|:-------------|:---------------|:--------|:---------------------|:--------------------|:-------------|:---------|:-------------|:---------------|:--------------|:-----------------|:-----------|:-----------------------------|:---------------|:---------|:-------------|:--------------|:--------|:------------|:--------------------|:-------------------|:-----------|:---------|:----------------|:---------------|:-----------|:------|:-----------|:--------|:--------|:--------|:-------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 1 | 35 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | | | X | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X 
| X | X | X | X | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | | | X | | X | | | | | X | X | X | X | X | X | X | | X | X | | | | X | | | X | X | X | X | X | X | X |
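Besides the waifuc raw package, the card's List of Packages table also ships IMG+TXT archives (for example `dataset-800.zip`), which can be read without waifuc at all. The sketch below is only an assumption about the layout, not something documented by the card: it supposes each image in the archive is paired with a same-named `.txt` tag file, and the local directory name `dataset_800` is arbitrary. The repo id and filename come from the table above.

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed in the card
zip_file = hf_hub_download(
    repo_id='CyberHarem/sorn_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local directory (name is arbitrary)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its tag file (assumed layout: foo.png next to foo.txt;
# other image extensions such as .jpg may also appear)
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)):
    tag_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(image_path, tags)
```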
CyberHarem/sorn_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:01:46+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:56:06+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sorn/ソーン (Granblue Fantasy) ====================================== This is the dataset of sorn/ソーン (Granblue Fantasy), containing 248 images and their tags. The core tags of this character are 'long\_hair, breasts, brown\_hair, hair\_ornament, head\_wings, bangs, large\_breasts, brown\_eyes, hairband', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
e0e51512cdd45b4ac28c4cf7cc4ae235d3bb3320
# Dataset of makira/マキラ (Granblue Fantasy) This is the dataset of makira/マキラ (Granblue Fantasy), containing 252 images and their tags. The core tags of this character are `blonde_hair, long_hair, hair_ornament, animal_ears, red_eyes, bangs, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 252 | 338.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 252 | 203.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 577 | 434.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 252 | 302.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 577 | 599.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/makira_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 44 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bandeau, detached_sleeves, solo, harvin, bare_shoulders, wide_sleeves, navel, looking_at_viewer, feathers, midriff, black_thighhighs, blush, collarbone, small_breasts, pelvic_curtain, sitting, chicken | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bandeau, bare_shoulders, detached_sleeves, feathers, harvin, looking_at_viewer, solo, wide_sleeves, chicken, simple_background, hair_beads, white_background, black_thighhighs, parted_lips | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, blush, harvin, looking_at_viewer, simple_background, solo, white_background, collarbone, detached_sleeves, hair_beads, long_sleeves, parted_lips, wide_sleeves, :o, bandeau, chicken, earrings, upper_body, bracelet, dated, feather_hair_ornament, hands_up, holding_animal | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, solo, twin_braids, long_sleeves, smile, wings, red_ribbon, twintails, white_pantyhose, blush, harvin, red_footwear, shoes, star_(symbol), white_dress, candle, chicken, hat, pointy_ears, sitting | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandeau | detached_sleeves | solo | harvin | bare_shoulders | wide_sleeves | navel | looking_at_viewer | feathers | midriff | black_thighhighs | blush | collarbone | small_breasts | pelvic_curtain | sitting | chicken | simple_background | hair_beads | white_background | parted_lips | long_sleeves | :o | earrings | upper_body | bracelet | dated | feather_hair_ornament | hands_up | holding_animal | twin_braids | smile | wings | red_ribbon | twintails | white_pantyhose | red_footwear | shoes | star_(symbol) | white_dress | candle | hat | pointy_ears | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------------------|:-------|:---------|:-----------------|:---------------|:--------|:--------------------|:-----------|:----------|:-------------------|:--------|:-------------|:----------------|:-----------------|:----------|:----------|:--------------------|:-------------|:-------------------|:--------------|:---------------|:-----|:-----------|:-------------|:-----------|:--------|:------------------------|:-----------|:-----------------|:--------------|:--------|:--------|:-------------|:------------|:------------------|:---------------|:--------|:----------------|:--------------|:---------|:------|:--------------| | 0 | 44 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | X | X | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | X | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | | | | X | | | | X | | | | X | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
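The core-tag pruning mentioned in the card can be checked locally: once the raw package has been extracted and loaded with `LocalSource` as in the snippet above, counting how often each tag occurs shows which tags dominate the character. This is only a sketch; it assumes `item.meta['tags']` is either a list of tag names or a mapping keyed by tag name, which the card does not guarantee, and it reuses the `dataset_dir` directory from the card's own loading example.

```python
from collections import Counter

from waifuc.source import LocalSource

# directory produced by the extraction step shown in the card above
source = LocalSource('dataset_dir')

counter = Counter()
for item in source:
    tags = item.meta['tags']
    # assumption: the tags field is a plain list or a {tag: score} mapping
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# the most frequent tags should roughly match the "core tags" listed in the card
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```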
CyberHarem/makira_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:01:55+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:49:57+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of makira/マキラ (Granblue Fantasy) ======================================== This is the dataset of makira/マキラ (Granblue Fantasy), containing 252 images and their tags. The core tags of this character are 'blonde\_hair, long\_hair, hair\_ornament, animal\_ears, red\_eyes, bangs, very\_long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
c004adf6528596e2551cb2dd8965d949d3c9d884
# Dataset of korwa/コルワ (Granblue Fantasy) This is the dataset of korwa/コルワ (Granblue Fantasy), containing 282 images and their tags. The core tags of this character are `long_hair, animal_ears, bangs, breasts, blue_eyes, blunt_bangs, hair_ornament, large_breasts, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 282 | 349.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korwa_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 282 | 220.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korwa_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 654 | 444.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korwa_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 282 | 317.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korwa_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 654 | 588.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korwa_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/korwa_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, dress, elbow_gloves, erune, solo, looking_at_viewer, mismatched_legwear, thighhighs, smile, quill, white_gloves, white_background, simple_background, blush, cat_ears, sitting, open_mouth | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_jacket, erune, looking_at_viewer, open_jacket, ribbed_dress, smile, solo, thighhighs, belt, long_sleeves, mismatched_legwear, quill, simple_background, white_background, blush, parted_lips, school_uniform, crossed_legs, feathers, full_body, grey_eyes, holding, sitting, skirt | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, erune, official_alternate_costume, smile, solo, cleavage, looking_at_viewer, simple_background, hair_flower, parted_lips, white_background, white_bikini, navel, blush, collarbone, very_long_hair, bracelet | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | dress | elbow_gloves | erune | solo | looking_at_viewer | mismatched_legwear | thighhighs | smile | quill | white_gloves | white_background | simple_background | blush | cat_ears | sitting | open_mouth | black_jacket | open_jacket | ribbed_dress | belt | long_sleeves | parted_lips | school_uniform | crossed_legs | feathers | full_body | grey_eyes | holding | skirt | official_alternate_costume | cleavage | hair_flower | white_bikini | navel | collarbone | very_long_hair | bracelet | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:---------------|:--------|:-------|:--------------------|:---------------------|:-------------|:--------|:--------|:---------------|:-------------------|:--------------------|:--------|:-----------|:----------|:-------------|:---------------|:--------------|:---------------|:-------|:---------------|:--------------|:-----------------|:---------------|:-----------|:------------|:------------|:----------|:--------|:-----------------------------|:-----------|:--------------|:---------------|:--------|:-------------|:-----------------|:-----------| | 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | 
![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | | X | X | X | X | X | X | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | X | X | | | X | | | X | X | X | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X |
CyberHarem/korwa_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:01:56+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:58:45+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of korwa/コルワ (Granblue Fantasy) ======================================= This is the dataset of korwa/コルワ (Granblue Fantasy), containing 282 images and their tags. The core tags of this character are 'long\_hair, animal\_ears, bangs, breasts, blue\_eyes, blunt\_bangs, hair\_ornament, large\_breasts, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
21a4d622809cc1a6850a171fb269362dd4ef1864
# Dataset of nio/ニオ (Granblue Fantasy) This is the dataset of nio/ニオ (Granblue Fantasy), containing 152 images and their tags. The core tags of this character are `hair_over_one_eye, purple_hair, pointy_ears, long_hair, hair_ornament, ponytail, purple_eyes, red_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 152 | 176.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 152 | 109.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 354 | 236.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 152 | 162.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 354 | 321.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nio_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/nio_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, harvin, solo, looking_at_viewer, navel_cutout, blush, cape, bare_shoulders, black_thighhighs, dress, breasts, simple_background, white_background | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, hair_flower, harvin, obi, solo, looking_at_viewer, paper_fan, wide_sleeves, yukata, smile, blue_kimono, holding_fan, long_sleeves, parted_lips, small_breasts | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, blush, looking_at_viewer, official_alternate_costume, open_mouth, paw_gloves, bangs, twintails, fang, pantyhose, very_long_hair, fur_trim, jack-o'-lantern, lion_tail, bow, claw_pose, halloween_costume, hood, orange_dress, red_necktie, braid, fake_animal_ears, harvin, sleeveless_dress, star_(symbol), white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | harvin | solo | looking_at_viewer | navel_cutout | blush | cape | bare_shoulders | black_thighhighs | dress | breasts | simple_background | white_background | hair_flower | obi | paper_fan | wide_sleeves | yukata | smile | blue_kimono | holding_fan | long_sleeves | parted_lips | small_breasts | official_alternate_costume | open_mouth | paw_gloves | bangs | twintails | fang | pantyhose | very_long_hair | fur_trim | jack-o'-lantern | lion_tail | bow | claw_pose | halloween_costume | hood | orange_dress | red_necktie | braid | fake_animal_ears | sleeveless_dress | star_(symbol) | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------------------|:---------------|:--------|:-------|:-----------------|:-------------------|:--------|:----------|:--------------------|:-------------------|:--------------|:------|:------------|:---------------|:---------|:--------|:--------------|:--------------|:---------------|:--------------|:----------------|:-----------------------------|:-------------|:-------------|:--------|:------------|:-------|:------------|:-----------------|:-----------|:------------------|:------------|:------|:------------|:--------------------|:-------|:---------------|:--------------|:--------|:-------------------|:-------------------|:----------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | 
X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/nio_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:02:17+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T15:32:42+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of nio/ニオ (Granblue Fantasy) ==================================== This is the dataset of nio/ニオ (Granblue Fantasy), containing 152 images and their tags. The core tags of this character are 'hair\_over\_one\_eye, purple\_hair, pointy\_ears, long\_hair, hair\_ornament, ponytail, purple\_eyes, red\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
7a3da728a92f64407c8047c6ac154dc939146700
# Dataset of heles/ヘルエス (Granblue Fantasy) This is the dataset of heles/ヘルエス (Granblue Fantasy), containing 330 images and their tags. The core tags of this character are `long_hair, animal_ears, breasts, cat_ears, grey_hair, braid, very_long_hair, large_breasts, single_braid, hair_between_eyes, brown_eyes, yellow_eyes, hairband`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 330 | 410.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 330 | 265.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 768 | 538.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 330 | 373.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 768 | 701.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/heles_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bracelet, cleavage, cloud, day, ears_through_headwear, erune, hair_tubes, hat_flower, looking_at_viewer, official_alternate_costume, solo, sun_hat, blue_sky, blush, collarbone, covered_navel, outdoors, smile, straw_hat, white_one-piece_swimsuit, bare_shoulders, frills, hibiscus, ocean, bangs, cowboy_shot, armlet, beach, hand_on_headwear, sarong, water | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, ears_through_headwear, erune, hair_tubes, hat_flower, looking_at_viewer, official_alternate_costume, solo, sun_hat, bracelet, smile, bare_shoulders, straw_hat, covered_navel, blush, white_one-piece_swimsuit, armlet, hibiscus, collarbone, sarong | | 2 | 28 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, erune, solo, hair_tubes, looking_at_viewer, cleavage, gloves, thighhighs, spear, armored_dress, holding_weapon, smile, simple_background | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, cleavage, erune, pauldrons, solo, armored_dress, gauntlets, gloves, hair_tubes, looking_at_viewer, simple_background, sitting, thighhighs, thighs | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, erune, hair_tubes, looking_at_viewer, solo, upper_body, cleavage, simple_background, white_background, pauldrons, smile | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, armpits, arms_up, erune, looking_at_viewer, solo, upper_body, blush, cleavage, sweat, hair_tubes, simple_background, arms_behind_head, closed_mouth, elbow_gloves, white_background | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, elbow_gloves, erune, looking_at_viewer, smile, solo, black_dress, cleavage, hair_tubes, official_alternate_costume, simple_background, white_gloves, bangs, blush, covered_navel, sidelocks, thighhighs, jewelry, white_background, backless_dress, thighs, white_hair | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | 
![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, erune, hetero, nipples, solo_focus, 1boy, hair_tubes, cum, gloves, penis, armor, ass, mosaic_censoring, nude, open_mouth | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bracelet | cleavage | cloud | day | ears_through_headwear | erune | hair_tubes | hat_flower | looking_at_viewer | official_alternate_costume | solo | sun_hat | blue_sky | blush | collarbone | covered_navel | outdoors | smile | straw_hat | white_one-piece_swimsuit | bare_shoulders | frills | hibiscus | ocean | bangs | cowboy_shot | armlet | beach | hand_on_headwear | sarong | water | gloves | thighhighs | spear | armored_dress | holding_weapon | simple_background | pauldrons | gauntlets | sitting | thighs | upper_body | white_background | armpits | arms_up | sweat | arms_behind_head | closed_mouth | elbow_gloves | black_dress | white_gloves | sidelocks | jewelry | backless_dress | white_hair | hetero | nipples | solo_focus | 1boy | cum | penis | armor | ass | mosaic_censoring | nude | open_mouth | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-----------|:--------|:------|:------------------------|:--------|:-------------|:-------------|:--------------------|:-----------------------------|:-------|:----------|:-----------|:--------|:-------------|:----------------|:-----------|:--------|:------------|:---------------------------|:-----------------|:---------|:-----------|:--------|:--------|:--------------|:---------|:--------|:-------------------|:---------|:--------|:---------|:-------------|:--------|:----------------|:-----------------|:--------------------|:------------|:------------|:----------|:---------|:-------------|:-------------------|:----------|:----------|:--------|:-------------------|:---------------|:---------------|:--------------|:---------------|:------------|:----------|:-----------------|:-------------|:---------|:----------|:-------------|:-------|:------|:--------|:--------|:------|:-------------------|:-------|:-------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | X | X | X | X | X | X | X | X | | X | X | X | | X | X | X | X | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 28 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | | X | X | | X | | X | | | | | | | | | | | | | | 
| | | | | | | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | | X | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | | X | X | | X | X | X | | | X | | X | | X | | | X | | | | X | | | | | | | | X | | | | X | | | | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
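The cluster tables above suggest that outfits can often be separated by a single tag; for this character, the swimsuit clusters are the ones carrying `official_alternate_costume`. Below is a minimal sketch for pulling those images out of the extracted raw package. Everything beyond the card's own snippet is an assumption: the structure of `item.meta['tags']`, that `item.image` behaves like a PIL image and can be saved directly, and the output directory name.

```python
import os

from waifuc.source import LocalSource

SOURCE_DIR = 'dataset_dir'        # extracted raw package, as in the card above
TARGET_DIR = 'heles_swimsuit'     # arbitrary output directory for this sketch
WANTED_TAG = 'official_alternate_costume'

os.makedirs(TARGET_DIR, exist_ok=True)

for item in LocalSource(SOURCE_DIR):
    tags = item.meta['tags']
    # assumption: tags is a list of names or a {tag: score} mapping
    names = tags.keys() if isinstance(tags, dict) else tags
    if WANTED_TAG in names:
        # assumption: item.image can be saved like a PIL image
        out_name = os.path.basename(item.meta['filename'])
        item.image.save(os.path.join(TARGET_DIR, out_name))
```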
CyberHarem/heles_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:24:58+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:39:35+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of heles/ヘルエス (Granblue Fantasy) ======================================== This is the dataset of heles/ヘルエス (Granblue Fantasy), containing 330 images and their tags. The core tags of this character are 'long\_hair, animal\_ears, breasts, cat\_ears, grey\_hair, braid, very\_long\_hair, large\_breasts, single\_braid, hair\_between\_eyes, brown\_eyes, yellow\_eyes, hairband', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
ff278c9f866c575577c6792395cc2085451b5ed5
# Dataset of sarasa/サラーサ (Granblue Fantasy) This is the dataset of sarasa/サラーサ (Granblue Fantasy), containing 200 images and their tags. The core tags of this character are `long_hair, horns, breasts, hair_between_eyes, grey_hair, large_breasts, red_eyes, bangs, ahoge, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 200 | 239.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarasa_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 200 | 163.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarasa_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 458 | 331.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarasa_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 200 | 223.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarasa_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 458 | 425.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sarasa_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sarasa_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, draph, solo, red_skirt, armor, cape, smile, looking_at_viewer, thighhighs, holding_weapon, axe, elbow_gloves, open_mouth, black_gloves, boots | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, draph, looking_at_viewer, nipples, nude, solo, fang, navel, open_mouth, pointy_ears, smile, upper_body | | 2 | 34 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, draph, looking_at_viewer, solo, blush, official_alternate_costume, red_bikini, collarbone, hair_flower, cleavage, smile, hibiscus, see-through, bare_shoulders, red_flower, sailor_collar, shirt, open_mouth, navel, simple_background, pointy_ears, white_background | | 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, draph, nipples, 1boy, hetero, solo_focus, penis, open_mouth, pointy_ears, pussy, navel, sex, spread_legs, sweat, bar_censor, nude, on_back, vaginal, cum, pubic_hair, smile, bikini | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | solo | red_skirt | armor | cape | smile | looking_at_viewer | thighhighs | holding_weapon | axe | elbow_gloves | open_mouth | black_gloves | boots | blush | nipples | nude | fang | navel | pointy_ears | upper_body | official_alternate_costume | red_bikini | collarbone | hair_flower | cleavage | hibiscus | see-through | bare_shoulders | red_flower | sailor_collar | shirt | simple_background | white_background | 1boy | hetero | solo_focus | penis | pussy | sex | spread_legs | sweat | bar_censor | on_back | vaginal | cum | pubic_hair | bikini | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:------------|:--------|:-------|:--------|:--------------------|:-------------|:-----------------|:------|:---------------|:-------------|:---------------|:--------|:--------|:----------|:-------|:-------|:--------|:--------------|:-------------|:-----------------------------|:-------------|:-------------|:--------------|:-----------|:-----------|:--------------|:-----------------|:-------------|:----------------|:--------|:--------------------|:-------------------|:-------|:---------|:-------------|:--------|:--------|:------|:--------------|:--------|:-------------|:----------|:----------|:------|:-------------|:---------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | 
![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | X | X | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 34 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | | | X | X | | | | | X | | | X | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | X | | | | | | X | | | X | X | X | | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
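Each of these cards lists the same five package archives, so when scripting over several repositories it can be handy to check which archives actually exist before downloading. A short sketch with `huggingface_hub` (the repo id is the one from this card; keeping only `.zip` entries is just a convenience filter):

```python
from huggingface_hub import list_repo_files

# list every file in the dataset repository and keep the package archives
files = list_repo_files('CyberHarem/sarasa_granbluefantasy', repo_type='dataset')
packages = [name for name in files if name.endswith('.zip')]

for name in sorted(packages):
    print(name)
```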
CyberHarem/sarasa_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:25:01+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:09:04+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sarasa/サラーサ (Granblue Fantasy) ========================================= This is the dataset of sarasa/サラーサ (Granblue Fantasy), containing 200 images and their tags. The core tags of this character are 'long\_hair, horns, breasts, hair\_between\_eyes, grey\_hair, large\_breasts, red\_eyes, bangs, ahoge, very\_long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
9ce132e5a10342c9c1a761d045a1c8354d785e44
# Dataset of yaia/ヤイア (Granblue Fantasy) This is the dataset of yaia/ヤイア (Granblue Fantasy), containing 191 images and their tags. The core tags of this character are `brown_hair, short_hair, horns, hairband, breasts, brown_eyes, large_breasts, hair_ornament`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 191 | 207.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yaia_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 191 | 125.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yaia_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 466 | 278.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yaia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 191 | 184.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yaia_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 466 | 384.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yaia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/yaia_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
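Besides the raw package, the IMG+TXT packages listed above (for example `dataset-800.zip`) can be consumed without waifuc. Below is a minimal sketch of that; it assumes each archive stores a same-named `.txt` file of comma-separated tags next to every image, and the `dataset_800` directory name is an illustrative choice, not something documented above.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package of this dataset.
zip_file = hf_hub_download(
    repo_id='CyberHarem/yaia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract it to a local directory (name chosen for illustration).
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair every image with its caption file (assumed layout: image.png + image.txt).
for root, _, files in os.walk(dataset_dir):
    for name in files:
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(root, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # skip images without a caption file
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        print(os.path.join(root, name), tags)
```

If the archive layout turns out to differ, only the pairing loop needs adjusting; the download and extraction steps mirror the raw-package snippet above.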
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, christmas, bell, blush, solo, draph, fur_trim, looking_at_viewer, mittens, open_mouth, reindeer_antlers, santa_costume, hood, skirt, smile, santa_hat, bangs, capelet, dress, fake_antlers, oppai_loli, white_background | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, draph, hair_bobbles, looking_at_viewer, oppai_loli, smile, solo, teddy_bear, ;d, blush, capelet, one_eye_closed, open_mouth, skirt, white_thighhighs, belt, bangs, long_sleeves, mary_janes, pouch, simple_background, white_shirt | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, draph, open_mouth, oppai_loli, solo, white_thighhighs, hair_bobbles, looking_at_viewer, smile, teddy_bear, petite, white_panties, ?, cameltoe, mary_janes, pink_panties, skirt_lift | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, blush, draph, oppai_loli, solo_focus, smile, penis, open_mouth, paizuri_under_clothes, pov, cum, huge_breasts, looking_at_viewer | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, 1girl, draph, hetero, nipples, oppai_loli, penis, solo_focus, blush, open_mouth, sex, vaginal, hair_bobbles, nude, thighhighs, petite, bar_censor, cum_in_pussy, lying, mosaic_censoring, navel, spread_legs, tears, teddy_bear | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | christmas | bell | blush | solo | draph | fur_trim | looking_at_viewer | mittens | open_mouth | reindeer_antlers | santa_costume | hood | skirt | smile | santa_hat | bangs | capelet | dress | fake_antlers | oppai_loli | white_background | hair_bobbles | teddy_bear | ;d | one_eye_closed | white_thighhighs | belt | long_sleeves | mary_janes | pouch | simple_background | white_shirt | petite | white_panties | ? 
| cameltoe | pink_panties | skirt_lift | 1boy | solo_focus | penis | paizuri_under_clothes | pov | cum | huge_breasts | hetero | nipples | sex | vaginal | nude | thighhighs | bar_censor | cum_in_pussy | lying | mosaic_censoring | navel | spread_legs | tears | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:--------|:-------|:--------|:-----------|:--------------------|:----------|:-------------|:-------------------|:----------------|:-------|:--------|:--------|:------------|:--------|:----------|:--------|:---------------|:-------------|:-------------------|:---------------|:-------------|:-----|:-----------------|:-------------------|:-------|:---------------|:-------------|:--------|:--------------------|:--------------|:---------|:----------------|:----|:-----------|:---------------|:-------------|:-------|:-------------|:--------|:------------------------|:------|:------|:---------------|:---------|:----------|:------|:----------|:-------|:-------------|:-------------|:---------------|:--------|:-------------------|:--------|:--------------|:--------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | X | | X | | X | | | | X | X | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | X | | X | | X | | | | | X | | | | | | X | | X | X | | | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | X | | X | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | | | | X | | | | | | | | | | | X | | X | X | | | | | | | | | | X | | | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/yaia_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:25:22+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:04:09+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of yaia/ヤイア (Granblue Fantasy) ====================================== This is the dataset of yaia/ヤイア (Granblue Fantasy), containing 191 images and their tags. The core tags of this character are 'brown\_hair, short\_hair, horns, hairband, breasts, brown\_eyes, large\_breasts, hair\_ornament', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
c0714bf9134a5015aa1492df96a68ceba0ac4dca
# Dataset of socie/ソシエ (Granblue Fantasy) This is the dataset of socie/ソシエ (Granblue Fantasy), containing 243 images and their tags. The core tags of this character are `animal_ears, long_hair, breasts, blue_eyes, hair_ornament, fox_ears, large_breasts, tail, bangs, fox_tail, very_long_hair, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 243 | 331.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 243 | 212.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 549 | 418.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 243 | 303.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 549 | 559.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/socie_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
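Once the raw package has been extracted as shown above, a quick tag-frequency pass shows which tags remain after the core tags were pruned. This is only a sketch: it assumes `item.meta['tags']` is a dict-like object keyed by tag name (as the loading snippet above suggests) and that the archive was extracted to `dataset_dir`.

```python
from collections import Counter

from waifuc.source import LocalSource

# 'dataset_dir' is the directory the raw archive was extracted to (see the snippet above).
source = LocalSource('dataset_dir')

counter = Counter()
for item in source:
    for tag in item.meta['tags']:  # assumption: iterating yields tag names
        counter[tag] += 1

# Show the most frequent remaining tags for this character.
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```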
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, collarbone, erune, solo, blush, simple_background, white_background, fur_trim, looking_at_viewer, detached_sleeves, upper_body | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, erune, fox_shadow_puppet, looking_at_viewer, smile, solo, blush, detached_sleeves, collarbone, sideboob | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, detached_sleeves, erune, looking_at_viewer, sideboob, solo, bare_back, looking_back, backless_outfit, smile, blush | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blunt_bangs, cleavage, erune, looking_at_viewer, navel, official_alternate_costume, smile, solo, bare_shoulders, parted_lips, simple_background, white_background, white_bikini, blush, hair_flower, bracelet, collarbone, holding, quill, see-through | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, elbow_gloves, erune, looking_at_viewer, solo, white_gloves, blunt_bangs, smile, blush, thighhighs, cleavage, quill, mismatched_legwear, white_dress, fingerless_gloves, parted_lips, holding, sitting, cat_ears, simple_background | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_jacket, erune, looking_at_viewer, open_jacket, smile, solo, thighhighs, blunt_bangs, blush, long_sleeves, mismatched_legwear, parted_lips, ribbed_dress, belt, feathers, quill, simple_background, crossed_legs, one_eye_closed, sitting, thighs, white_background | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, blush, erune, nipples, open_mouth, solo_focus, sweat, hetero, navel, fang, nude, penis, pussy_juice, barefoot, censored, collarbone, detached_sleeves, feet, heart-shaped_pupils, looking_at_viewer, saliva, sex_from_behind, spread_legs, tears, tongue_out | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | collarbone | erune | solo | blush | simple_background | white_background | fur_trim | looking_at_viewer | detached_sleeves | upper_body | fox_shadow_puppet | smile | sideboob | bare_back | looking_back | 
backless_outfit | blunt_bangs | navel | official_alternate_costume | bare_shoulders | parted_lips | white_bikini | hair_flower | bracelet | holding | quill | see-through | elbow_gloves | white_gloves | thighhighs | mismatched_legwear | white_dress | fingerless_gloves | sitting | cat_ears | black_jacket | open_jacket | long_sleeves | ribbed_dress | belt | feathers | crossed_legs | one_eye_closed | thighs | 1boy | nipples | open_mouth | solo_focus | sweat | hetero | fang | nude | penis | pussy_juice | barefoot | censored | feet | heart-shaped_pupils | saliva | sex_from_behind | spread_legs | tears | tongue_out | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------------|:--------|:-------|:--------|:--------------------|:-------------------|:-----------|:--------------------|:-------------------|:-------------|:--------------------|:--------|:-----------|:------------|:---------------|:------------------|:--------------|:--------|:-----------------------------|:-----------------|:--------------|:---------------|:--------------|:-----------|:----------|:--------|:--------------|:---------------|:---------------|:-------------|:---------------------|:--------------|:--------------------|:----------|:-----------|:---------------|:--------------|:---------------|:---------------|:-------|:-----------|:---------------|:-----------------|:---------|:-------|:----------|:-------------|:-------------|:--------|:---------|:-------|:-------|:--------|:--------------|:-----------|:-----------|:-------|:----------------------|:---------|:------------------|:--------------|:--------|:-------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | X | | | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | X | X | X | | | X | | | | X | | | | | X | | | X | X | | | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | 
![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | X | X | X | | X | | | | X | | | | | X | | | | X | | | | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | | X | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/socie_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:34:25+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:22:29+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of socie/ソシエ (Granblue Fantasy) ======================================= This is the dataset of socie/ソシエ (Granblue Fantasy), containing 243 images and their tags. The core tags of this character are 'animal\_ears, long\_hair, breasts, blue\_eyes, hair\_ornament, fox\_ears, large\_breasts, tail, bangs, fox\_tail, very\_long\_hair, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
13042700309a0084a8d26290df376202cb459ddf
# Dataset of sen/セン (Granblue Fantasy) This is the dataset of sen/セン (Granblue Fantasy), containing 271 images and their tags. The core tags of this character are `animal_ears, grey_hair, hair_between_eyes, breasts, long_hair, medium_breasts, bangs, red_eyes, fang`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 271 | 308.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 271 | 211.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 645 | 426.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 271 | 288.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 645 | 538.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sen_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sen_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
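The clusters listed below can also be turned back into image subsets, e.g. to mine a particular outfit. A minimal sketch follows, using the defining tags of cluster #0 from the tables below and the same assumptions as before (`item.meta['tags']` keyed by tag name, raw archive already extracted to `dataset_dir`).

```python
from waifuc.source import LocalSource

# Defining tags taken from cluster #0 in the tables below.
cluster_tags = {'claw_(weapon)', 'claws', 'erune', 'collar'}

# 'dataset_dir' holds the extracted raw package, as in the snippet above.
source = LocalSource('dataset_dir')

matched = []
for item in source:
    item_tags = set(item.meta['tags'])  # assumption: keys are tag names
    if cluster_tags.issubset(item_tags):
        matched.append(item.meta['filename'])

print(f'{len(matched)} images look like cluster #0')
for filename in matched:
    print(filename)
```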
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, claw_(weapon), claws, erune, looking_at_viewer, solo, collar, open_mouth, smile, gloves, white_background, simple_background, blush, boots, pleated_skirt, fangs, orange_eyes | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, claw_(weapon), claws, erune, gloves, hair_flower, open_mouth, skirt, solo, sleeveless, white_flower, brown_eyes, :d, blush, looking_at_viewer, white_background, bare_shoulders, collarbone, fangs, hood_down | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_collar, blush, brown_eyes, erune, hair_flower, looking_at_viewer, smile, solo, upper_body, white_flower, ahoge, animal_ear_fluff, cleavage, collarbone, hood_down, :3, bare_shoulders, closed_mouth, jacket | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | claw_(weapon) | claws | erune | looking_at_viewer | solo | collar | open_mouth | smile | gloves | white_background | simple_background | blush | boots | pleated_skirt | fangs | orange_eyes | hair_flower | skirt | sleeveless | white_flower | brown_eyes | :d | bare_shoulders | collarbone | hood_down | black_collar | upper_body | ahoge | animal_ear_fluff | cleavage | :3 | closed_mouth | jacket | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------|:--------|:--------------------|:-------|:---------|:-------------|:--------|:---------|:-------------------|:--------------------|:--------|:--------|:----------------|:--------|:--------------|:--------------|:--------|:-------------|:---------------|:-------------|:-----|:-----------------|:-------------|:------------|:---------------|:-------------|:--------|:-------------------|:-----------|:-----|:---------------|:---------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | 
X | X | | | X | | | | X | | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/sen_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:34:40+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:22:58+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sen/セン (Granblue Fantasy) ==================================== This is the dataset of sen/セン (Granblue Fantasy), containing 271 images and their tags. The core tags of this character are 'animal\_ears, grey\_hair, hair\_between\_eyes, breasts, long\_hair, medium\_breasts, bangs, red\_eyes, fang', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
142ef59056a142c3184a0d9d556048b74c8c1e12
# Dataset of sara/サラ (Granblue Fantasy) This is the dataset of sara/サラ (Granblue Fantasy), containing 207 images and their tags. The core tags of this character are `long_hair, orange_hair, hairband, breasts, blue_eyes, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 207 | 262.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 207 | 163.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 477 | 335.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 207 | 239.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 477 | 456.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sara_granbluefantasy', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
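For trainers that expect plain image/caption pairs on disk, the raw package can be flattened into such pairs. This is a sketch only: it assumes `item.image` is a PIL image and that iterating `item.meta['tags']` yields tag names; the `sara_img_txt` output directory and zero-padded filenames are illustrative choices, not part of the dataset.

```python
import os

from waifuc.source import LocalSource

# 'dataset_dir' holds the extracted raw package (see the snippet above).
source = LocalSource('dataset_dir')

out_dir = 'sara_img_txt'  # illustrative output directory
os.makedirs(out_dir, exist_ok=True)

for i, item in enumerate(source):
    stem = os.path.join(out_dir, f'{i:04d}')
    # Assumption: item.image is a PIL.Image.Image, so .save() works directly.
    item.image.save(stem + '.png')
    # Assumption: iterating meta['tags'] yields tag names.
    caption = ', '.join(item.meta['tags'])
    with open(stem + '.txt', 'w', encoding='utf-8') as f:
        f.write(caption)
```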
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cape, dress, solo, barefoot, looking_at_viewer, anklet, smile | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, hair_flower, looking_at_viewer, solo, cape, blush, navel, smile, bikini, small_breasts, white_background | | 2 | 22 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, green_eyes, looking_at_viewer, ponytail, brown_hair, hair_ribbon, blush, skirt, black_gloves, medium_breasts, black_thighhighs, sleeveless, smile, cape, armpits, open_mouth, white_background, simple_background, very_long_hair | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_thighhighs, looking_at_viewer, santa_hat, solo, fur_trim, black_gloves, christmas, navel, ponytail, santa_bikini, medium_breasts, open_mouth, blush, boots, cleavage, green_eyes, high_heels, very_long_hair, brown_hair, red_cape, simple_background, :d, aqua_eyes, one_eye_closed, red_bikini | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, blush, detached_sleeves, red_skirt, solo, aqua_eyes, hair_ribbon, looking_at_viewer, medium_breasts, plaid_skirt, holding, long_sleeves, scarf, simple_background, thighhighs, twintails, white_background, white_shirt, apron, brown_hair, closed_mouth, green_eyes, hair_intakes, valentine, very_long_hair | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | dress | solo | barefoot | looking_at_viewer | anklet | smile | hair_flower | blush | navel | bikini | small_breasts | white_background | green_eyes | ponytail | brown_hair | hair_ribbon | skirt | black_gloves | medium_breasts | black_thighhighs | sleeveless | armpits | open_mouth | simple_background | very_long_hair | santa_hat | fur_trim | christmas | santa_bikini | boots | cleavage | high_heels | red_cape | :d | aqua_eyes | one_eye_closed | red_bikini | bare_shoulders | detached_sleeves | red_skirt | plaid_skirt | holding | long_sleeves | scarf | thighhighs | twintails | white_shirt | apron | closed_mouth | hair_intakes | valentine | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------|:-----------|:--------------------|:---------|:--------|:--------------|:--------|:--------|:---------|:----------------|:-------------------|:-------------|:-----------|:-------------|:--------------|:--------|:---------------|:-----------------|:-------------------|:-------------|:----------|:-------------|:--------------------|:-----------------|:------------|:-----------|:------------|:---------------|:--------|:-----------|:-------------|:-----------|:-----|:------------|:-----------------|:-------------|:-----------------|:-------------------|:------------|:--------------|:----------|:---------------|:--------|:-------------|:------------|:--------------|:--------|:---------------|:---------------|:------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 22 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | X | | | | X | X | | | | X | X | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | | | | X | | | | X | X | | X | X | | | X | | | | | X | X | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/sara_granbluefantasy
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-21T15:34:43+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-21T16:15:22+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sara/サラ (Granblue Fantasy) ===================================== This is the dataset of sara/サラ (Granblue Fantasy), containing 207 images and their tags. The core tags of this character are 'long\_hair, orange\_hair, hairband, breasts, blue\_eyes, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code. List of Clusters ---------------- List of tag clustering results; some outfits may be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]